I have received many requests about how to create PIN authorization for specific users on Lync so they can dial international numbers.
Here’s a sample scenario and a screenshot of the normalization rule that meets this case:
- The user has to type a PIN before any dialed international number. The PIN is #9090 and is shared with the user by the administrator in a secure email.
Scenario 1: The user wants to dial 0097466040129 and types it directly. Notice in the highlighted test result below that the call will not succeed, as the number is not normalized.
Scenario 2: The user types the number in the format #9090009746040129. Notice in the highlighted test result below that the resulting number is +97466040129; the call will succeed, as the number is normalized correctly.
The above is just an example that can be tailored according to different environments.
Notes:
- The above normalization rule should be duplicated for PIN + E.164-typed numbers.
- Another normalization rule might be required to catch E.164 numbers typed directly by users attempting to bypass the PIN enforcement; this rule will delete international numbers typed in E.164 format without the PIN.
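For illustration only, the matching logic behind such a rule can be sketched as a regular expression. This is a minimal sketch assuming the #9090 PIN and an optional 00 international prefix; the digits are made up, and the real rule lives in the Lync dial plan, not in code:

```csharp
using System;
using System.Text.RegularExpressions;

class PinNormalizationSketch
{
    // Hypothetical rule: strip the "#9090" PIN and an optional "00"
    // international prefix, then emit the number in E.164 format.
    static readonly Regex Rule = new Regex(@"^#9090(?:00)?(\d+)$");

    static string Normalize(string dialed)
    {
        Match m = Rule.Match(dialed);
        return m.Success ? "+" + m.Groups[1].Value : "(no match - call blocked)";
    }

    static void Main()
    {
        Console.WriteLine(Normalize("#90900097466040129")); // PIN typed: normalized
        Console.WriteLine(Normalize("0097466040129"));      // no PIN: not normalized
    }
}
```

A real dial plan would express this as a pattern/translation pair in the Lync normalization rule rather than as code.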
I hope it helps!
This is the first post in a series to take you through my learning experience with the Windows Azure Service Bus. The Service Bus is a concrete implementation of the service bus pattern designed to operate at internet scope within highly scalable Microsoft data centers as a constituent of the Windows Azure platform. The Service Bus provides a federated identity and access control mechanism through Access Control, a federated naming system, a dynamic service registry, and a robust messaging fabric capable of overcoming the connectivity challenges of the internet and distributed services. The internet challenges that any distributed service platform faces are summarized in the diagram below.
The way that the service bus overcomes these challenges is by implementing a relay service pattern as below.
Here’s how it works: the on-premises service connects to the relay service through an outbound port and creates a bidirectional socket for communication tied to a particular rendezvous address. The client can then communicate with the on-premises service by sending messages to the relay service targeting the rendezvous address. The relay service will then “relay” messages to the on-premises service through the bidirectional socket already in place. The client does not need a direct connection to the on-premises service nor does it need to know where it resides. The on-premises service doesn’t need any inbound ports open on the firewall. This is how most instant messaging applications work today.
A central component of the Service Bus messaging fabric is a centralized (but highly load-balanced) relay service that supports a variety of different transport protocols and Web services standards, including SOAP, WS-*, and even REST. The relay service provides a variety of different relay connectivity options and can even help negotiate direct peer-to-peer connections when possible.
So in this article I will show you how to develop your first hello service hosted on the Service Bus. The major steps are summarized below.
Step 1: Create a new service bus namespace on the Azure service bus.
Step 2: Implement the service and client.
Step 3: Configure the service and client to use the required relay binding configuration.
So as you can see, it is as easy as 1, 2, and 3. Or is it?! :)
So you need to go to the URL https://windows.azure.com/, which will ask you either to log in using a Live ID that already has an active Azure subscription or to sign up. If you already have an account, just log on; if not, click sign up.
That will lead you to the purchase screen of subscription options. Study your options well and pick whichever type you want, or (even better) click on the FREE trial.
Then you will be asked to log in using an existing Live ID, and you will need to go through three steps as below:
In the third step you will need to enter a credit card, just as a form of verification, and voila, you have a three-month trial subscription on Azure.
Next, log on to your management portal using your Live ID at the URL https://windows.azure.com/; it should look something like this.
Now to create a new namespace do the following steps.
· Click on “Service Bus, Access Control & Caching” from the bottom left.
· Then click on “Service Bus”
· Then click “New”
· Enter the namespace name; the portal will check whether it is available for you. Then select the needed details as below:
· Click “Create” and the namespace would be ready to use.
· Now we will need three details to get started: the service hosting URL, the issuer name, and the issuer secret. The service hosting URL depends on the namespace you created, so for the example shown above the URL would be: https://MyAzureHello.servicebus.windows.net.
· To get the secret details, click on the created namespace.
· Then scroll the right action pane all the way down and click “View” for the “Default Key”.
· It will then give you the option to copy both the issuer name and the issuer secret to the clipboard. Do this one by one and keep this information where you can use it later.
This is a straightforward step. Just create two console applications, one for the service host and another for the client, and create a service definition as below.
[ServiceContract]
public interface IHelloServiceBus
{
    [OperationContract]
    string SayHello(string name);
}
Listing 1: Service Contract
public class HelloServiceBus : IHelloServiceBus
{
    public string SayHello(string name)
    {
        string greeting = string.Format("Hello {0}!", name);
        Console.WriteLine("Returning: {0}", greeting);
        return greeting;
    }
}
Listing 2: Service Implementation
static void Main(string[] args)
{
    Console.WriteLine("**** Service ****");
    ServiceHost host = new ServiceHost(typeof(HelloServiceBus));
    host.Open();
    Console.WriteLine("Press [Enter] to exit");
    Console.ReadLine();
    host.Close();
}
Listing 3: Host Implementation
static void Main(string[] args)
{
    Console.WriteLine("**** Client ****");
    Console.WriteLine("Press <Enter> to run client.");
    Console.ReadLine();
    Console.WriteLine("Starting.");
    ChannelFactory<IHelloServiceBus> channelFactory =
        new ChannelFactory<IHelloServiceBus>("webRelay");
    IHelloServiceBus channel = channelFactory.CreateChannel();
    for (int i = 0; i < 10; i++)
    {
        string response = channel.SayHello("Service Bus");
        Console.WriteLine(response);
    }
    channelFactory.Close();
}
Listing 4: Client Implementation
Now we need to configure both the client and service to use the service bus bindings.
All you need to do is put the proper Azure Service Bus configuration in place. But before you do that, how would the service and client implementations know where to get the binding implementation from? You did not add any custom references, right?! So here is how.
· Make sure both the client and service are using the .NET 4 profile (not the Client Profile).
· Then go in Visual Studio to Tools and then Extension Manager:
· Click on Online Gallery and search for “NuGet”.
· Click Download and install the NuGet Package Manager.
· Close the Extension Manager.
· Right click on the References node of the service project and click “Manage NuGet Packages”.
· Click Online.
· Search for “Azure”, select the “Windows Azure Service Bus” package, and click “Install”.
· Install the same NuGet package into the client project.
· Now that you have the Azure assemblies in place, you can change the service and client configurations as below.
<system.serviceModel>
  <services>
    <service name="Service.HelloServiceBus">
      <endpoint address="https://momalek.servicebus.windows.net/helloservicebus" behaviorConfiguration="sharedSecretClientCredentials" binding="ws2007HttpRelayBinding" contract="Service.IHelloServiceBus"/>
    </service>
  </services>
  <behaviors>
    <endpointBehaviors>
      <behavior name="sharedSecretClientCredentials">
        <transportClientEndpointBehavior credentialType="SharedSecret">
          <clientCredentials>
            <sharedSecret issuerName="[Issuer name retrieved before]" issuerSecret="[issuer secret retrieved before]"/>
          </clientCredentials>
        </transportClientEndpointBehavior>
      </behavior>
    </endpointBehaviors>
  </behaviors>
</system.serviceModel>
Listing 5: Service Configuration
<client>
  <endpoint address="https://momalek.servicebus.windows.net/helloservicebus" behaviorConfiguration="sharedSecretClientCredentials" binding="ws2007HttpRelayBinding" contract="Service.IHelloServiceBus" name="webRelay"/>
</client>
Listing 6: Client Configuration
Now you are ready to start the service (wait for it to properly start, as it takes some time, a couple of minutes or so) and then the client, and watch them communicate through the Azure Service Bus.
During this exercise I tried many bindings, and I must say that the most reliable one I used was “ws2007HttpRelayBinding” (reliable in the sense of being able to start hosting the service with no problems).
Hosting a service behind a proxy (especially a proxy that requires authentication) is not supported and does not work. Check this URL: http://msdn.microsoft.com/en-us/library/windowsazure/ee706729.aspx.
Visual Round Trip Analyzer (VRTA) is a tool that helps the SharePoint Administrator identify what is being downloaded at a web page level.
One of the biggest complaints from users is the response time.
VRTA excels in showing the network round trip relationship between the client and the server.
This is also critical to the well-being of a farm. While an administrator can optimize the server response, there are several other parties that can inadvertently be working against this:
All of these listed parties create solutions using SharePoint Designer, Notepad, and possibly Visual Studio, and the administrator would have no knowledge of this. But in the end, the administrator is the person who will get the support call.
Using VRTA, the administrator can identify the bottlenecks and involve the right parties.
You must have VRTA loaded on a PC (a free download from the Microsoft Download Center). Netmon 3.4, also a free download, needs to be installed on the PC. These tools should not be run on servers but on local machines. No special permissions are needed, and VRTA can be run against a public site.
VRTA uses Microsoft Network Monitor 3.4 packet analyzer as its foundation. Visually, it shows files and packets, along with the round trip information that occurs between a client and server.
When evaluating page loads, several factors should be taken into account:
Using the four tabs (Main Chart, Statistics, All Files, and Analysis), the data the page retrieves and loads can be seen in detail. In the preceding screenshot, every file that is loaded shows how long it took to load, the port, the type of file, a status code, and the size.
The administrator can observe the assets that are being used and be able to offer recommendations such as creating a sprite instead of loading each individual image, or combining JavaScript files. Hovering over each detail item will present further detail on the individual asset.
VRTA also has an Analysis tab that acts as a best practice guide. It grades the files and page on several basic factors such as an average file size rule, white spaces rule, and image clustering rule. Using a color-coded scheme, it makes recommendations to help you improve performance.
Finally, every time a recording is made, it is saved by default in a directory whose path can be seen in the title of the VRTA application.
Introduction
Hello again. In the first part we gave an introduction to TFS 2010 and its installation in a dual-server topology. In this post, we will talk about migrating existing TFS databases to the TFS 2010 platform. Let's start!
Firstly, before we get into the details of the import upgrade, I would like to mention other types of scenarios you might encounter:
How-To
So, here we have a TFS 2010 server that is up and running well, and we now want to import the content of a TFS 2008 server (this applies to TFS 2005 as well). One of the biggest differences between 2010 and the old versions is that 2010 is based on Team Project Collections (TPCs): one collection maps to a single database and may contain zero or more team projects, independent of other TPCs.
Steps:
Picture 1: Arguments of tfsconfig.exe
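For reference, a typical invocation of the import shown in Picture 1 might look like the following; the SQL instance and collection names are placeholders, so check `tfsconfig import /?` for the exact semantics in your build:

```shell
tfsconfig import /sqlinstance:SQLSERVER01 /collectionName:LegacyCollection /confirmed
```

This is run from an elevated command prompt in the TFS 2010 Tools directory on the application tier; the source database is attached and upgraded as a new team project collection.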
Data Model in TFS 2010
Databases used by TFS 2010 with some notes
Picture 2: TFS 2010 SQL Database, Tfs_Warehouse
Health Check
After importing, it would be wise to re-run the Best Practices Analyzer, just as you did before the import process. To do so,
Picture 3: TFS 2010 Best Practices Analyzer running
Conclusion
In this second part, we executed a scenario of importing TFS 2005/2008 databases into a TFS 2010 server using the tfsconfig command. Hope you liked it.
References
I recently came across a requirement to implement a Windows Communication Foundation (WCF) service that acts as a message router in front of several backend WCF services. This router service’s main functionality is to encapsulate the backend services and to add automatic request monitoring. This should not be hard, I thought, but there are a couple of twists: first, the router service should not be sensitive to the services themselves, how they change, or how they communicate with their clients. It will also be handling encrypted messages that it cannot decrypt, as these messages belong to other partners that we do not control, and of course we do not have the certificate used for encryption.
The way this is done is by implementing a custom WCF dispatch message inspector and a custom WCF client message inspector behaviour. Why do we need both? Because our service acts as a WCF service to all the clients and as a client to all the backend services. The main idea is that when a message is received over the wire, the entire message body is escaped and placed in a custom header element, and before the message is sent to the backend, the original message body is extracted from the header and written back to the outgoing message. When the response is received from the backend service, the same logic happens in reverse: the reply is written to the custom header as an escaped string, and before the reply is delivered to the original caller, it is read from the header and sent to the client. Simple, right? :) Well, it gets even better: since we want the original message to travel on the wire unparsed and unchanged, this has to be done using a custom message encoder on both sides, receiving requests from the clients and receiving responses from the backend services. To try to make this simple, I put together the following diagram.
So you need to implement a custom message encoder inheriting from the class “MessageEncoder” and mainly implement the method “ReadMessage” to be as follows.
public override Message ReadMessage(ArraySegment<byte> buffer, BufferManager bufferManager, string contentType)
{
    byte[] msgContents = new byte[buffer.Count];
    Array.Copy(buffer.Array, buffer.Offset, msgContents, 0, msgContents.Length);
    bufferManager.ReturnBuffer(buffer.Array);

    MemoryStream stream = new MemoryStream(msgContents);
    Message message = ReadMessage(stream, int.MaxValue);

    // Preserve the original raw message in a custom header.
    string ns = "http://Somenamespace";
    MessageHeader header = MessageHeader.CreateHeader("OriginalFullMessage", ns, UTF8Encoding.UTF8.GetString(msgContents));
    message.Headers.Add(header);
    return message;
}
So you need to implement a custom class implementing “IDispatchMessageInspector”, together with an endpoint behavior whose “ApplyDispatchBehavior” method (from “IEndpointBehavior”) adds the inspector to the endpoint as follows.
public void ApplyDispatchBehavior(ServiceEndpoint endpoint, EndpointDispatcher endpointDispatcher)
{
    endpointDispatcher.DispatchRuntime.MessageInspectors.Add(this);
    endpointDispatcher.AddressFilter = new System.ServiceModel.Dispatcher.MatchAllMessageFilter();
}
Then you need to implement the method “BeforeSendReply” to put the original reply back on the wire as follows.
public void BeforeSendReply(ref System.ServiceModel.Channels.Message reply, object correlationState)
{
    string ns = "http://Somenamespace";
    string fullMessageHeader = "OriginalFullMessage";
    string origMsg = OperationContext.Current.RequestContext.RequestMessage.ToString();
    int fullMessageHeaderIndex = reply.Headers.FindHeader(fullMessageHeader, ns);
    if (fullMessageHeaderIndex >= 0)
    {
        origMsg = UnescapeXml(reply.Headers.GetHeader<string>(fullMessageHeaderIndex));
    }
    Message newreply = Message.CreateMessage(MessageVersion.None, reply.Headers.Action, new SimpleMessageBody(origMsg));
    newreply.Headers.To = reply.Headers.To;
    reply = newreply;
}
You will then need to implement the method “BeforeSendRequest” from the interface “IClientMessageInspector” to put the original request back on the wire before we send it to the backend services.
public object BeforeSendRequest(ref System.ServiceModel.Channels.Message request, System.ServiceModel.IClientChannel channel)
{
    string ns = "http://Somenamespace";
    string fullMessageHeader = "OriginalFullMessage";
    int fullMessageHeaderIndex = request.Headers.FindHeader(fullMessageHeader, ns);
    string origMsg = UnescapeXml(request.Headers.GetHeader<string>(fullMessageHeaderIndex));
    Message newRequest = Message.CreateMessage(MessageVersion.None, request.Headers.Action, new SimpleMessageBody(origMsg));
    newRequest.Headers.To = request.Headers.To;
    request = newRequest;
    return null;
}
And that’s it! :)
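For completeness, the snippets above reference two helpers that are not shown in the post: “SimpleMessageBody” and “UnescapeXml”. The following is a minimal sketch of what they might look like; these implementations are assumptions, not the author’s original code:

```csharp
using System.IO;
using System.ServiceModel.Channels;
using System.Xml;

// Writes a raw XML string verbatim as the body of a WCF message.
public class SimpleMessageBody : BodyWriter
{
    private readonly string content;

    public SimpleMessageBody(string content) : base(true)
    {
        this.content = content;
    }

    protected override void OnWriteBodyContents(XmlDictionaryWriter writer)
    {
        using (XmlReader reader = XmlReader.Create(new StringReader(content)))
        {
            writer.WriteNode(reader, false);
        }
    }
}

public static class XmlEscaping
{
    // Reverses the escaping applied when the body was stored in the header.
    // "&amp;" is handled last so double-escaped entities survive.
    public static string UnescapeXml(string escaped)
    {
        return escaped.Replace("&lt;", "<")
                      .Replace("&gt;", ">")
                      .Replace("&quot;", "\"")
                      .Replace("&apos;", "'")
                      .Replace("&amp;", "&");
    }
}
```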
Usually when writing custom BizTalk pipeline components you find yourself wanting to extract specific values from the message passed using Xpath statements.
You can do this with either XPathDocument or XDocument, but that would require loading the entire XML document into memory, which may not be possible if the XML file is huge. It also makes the pipeline component slower. The solution is to use a streaming class such as XmlReader. But that would be too much work, right?
The solution comes in the form of a hidden gem among the installed BizTalk components, called the XPathReader. This is a stream-based class that searches for a node or element using a given set of XPath strings.
This class is defined in the assembly Microsoft.BizTalk.XPathReader.dll deployed to the GAC. You need to add a reference to this assembly first and then use the class as below.
MsgStream.Seek(0, SeekOrigin.Begin);
XmlReader reader = XmlReader.Create(MsgStream, settings);
string strValue = null;
if (!string.IsNullOrEmpty(MsgXPath))
{
    // Register the XPath expression before the streaming read starts.
    XPathCollection xPathCollection = new XPathCollection();
    xPathCollection.Add(MsgXPath);
    XPathReader xPathReader = new XPathReader(reader, xPathCollection);
    if (xPathReader.ReadUntilMatch())
    {
        if (xPathReader.Match(0))
        {
            strValue = xPathReader.ReadString();
        }
    }
}
Where MsgStream is a seekable stream obtained from the message.
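For context, here is a sketch of how such a seekable stream might be obtained inside a pipeline component’s Execute method, using ReadOnlySeekableStream from Microsoft.BizTalk.Streaming; treat the surrounding method as an illustrative assumption to be adapted to your component:

```csharp
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;
using Microsoft.BizTalk.Streaming;

public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    // Wrap the (possibly forward-only) body stream in a seekable wrapper.
    var seekable = new ReadOnlySeekableStream(pInMsg.BodyPart.GetOriginalDataStream());
    pContext.ResourceTracker.AddResource(seekable);

    // ... run the XPathReader extraction shown above against 'seekable' ...

    // Rewind and hand the stream back so downstream components see the full body.
    seekable.Position = 0;
    pInMsg.BodyPart.Data = seekable;
    return pInMsg;
}
```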
Below is the order I followed recently to install and configure a 4-server SharePoint Farm.
* Update: I was discussing the order of installation with a colleague and got the news that the Dec 2011 CU hotfix package (2597014) is "the full server package for SharePoint Server 2010 and contains also the MSF2010 [SharePoint Foundation] fixes so you need only this one package", so there is no need to apply the CU for SharePoint Foundation separately.
Good luck!
In a real SOA implementation, you will probably be exposing many WCF services that you need to secure. There are many blogs around STS and WCF, but none of them guides you through a basic implementation of a custom STS using Windows Identity Foundation (WIF) to secure your WCF services. If you are just starting with STS/WIF, or you have spent some time trying to implement a basic STS with no luck, this blog is just for you. I assume you are already familiar with what an STS is as an authentication mechanism, with WCF in general, and with Visual Studio. When you finish all the steps, you should have an STS, a console client, and a WCF service that uses the STS for authentication.
I've added the images to allow you to follow easily with the steps.
This demonstration assumes that you will code on only one machine, which will contain the console client, the service, and the custom STS. I use a machine named VS2010 on the domain contoso.com.
You will need to prepare a single certificate that will be used for signing and encrypting the STS tokens. To do so, create a self-signed certificate from IIS, then use this certificate to encrypt the traffic of the IIS default website (SSL).
This step is critical. If you skip it, your STS will probably consider anonymous users only for authentication.
You should be able to view the content of that XML file in your browser. This ensures that your STS is created successfully.
Now we will configure the WCF service to use the STS for security.
1. Right click on the project “Secure WCF Service” and click “Add STS Reference”.
2. In the first screen, leave all defaults and click “Next”.
3. In the second screen, leave all defaults as well and click “Next”.
4. In the next screen, select “Use an Existing STS”.
5. In the “STS WS-Federation Metadata document location” box, type the address of the STS Federation metadata file. It should be something like “http://vs2010.contoso.com/DemoSTS/FederationMetaData/2007-06/FederationMetaData.xml”. Then click Next.
6. Click “Enable Encryption”, the click “Select an Existing Certificate from Store”
7. Click “OK” when the certificate is selected, then click “Next”.
8. In the next screen click “Next”.
9. In the final screen click “Finish”. Make sure that the checkbox “Schedule task to perform…” is unchecked.
1. Open the “web.config” file of the “DemoSTS” project.
2. Modify the “appSettings” section to specify the certificates that you want to use for the encryption and signing. After you modify it, it should look something like this…
Note: In the above example, I use the same certificate for both signing and encryption. IN real life scenarios, those certificates should be different.
3. Save the “web.config” file and close it.
4. Right click on the project “SecureWCFService” and click “Update Federation Metadata”.
Now, we will create a console application and modify the service code to read the claims in the STS token.
1. Add a simple console application to the same solution so that its structure looks like this:
2. Right-click “DemoConsole”, the console application project you’ve just added, and select “Add Service Reference”:
3. In the “Add Service Reference” dialog window, click “Discover”, and when the “Address” text box is populated with the address of the WCF service you created, double-click the service name in the “Services” list box; in the picture below it is “SecureWCFService/Service.svc”. Then click “OK”:
4. Now test that the client and service actually work by populating the “Main” method of the console application’s “Program” class with the following code:
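The original listing is not reproduced in this chunk; a minimal sketch of such a “Main” method might look like this, where the generated proxy namespace “ServiceReference1” and client type name are assumptions produced by “Add Service Reference”:

```csharp
using System;

class Program
{
    static void Main(string[] args)
    {
        // "ServiceReference1.ServiceClient" is whatever proxy name
        // "Add Service Reference" generated for you.
        var client = new ServiceReference1.ServiceClient();

        // The STS round trip happens transparently through the federation
        // binding that was generated in app.config.
        Console.WriteLine(client.GetData(42));

        client.Close();
        Console.ReadLine();
    }
}
```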
5. Modify the “GetData” function in the service code to also list all claims. The code should look like the following…
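The listing for this step is omitted from this chunk; with WIF 1.0 (the Microsoft.IdentityModel assembly), a claims-listing “GetData” might be sketched as follows. The method shape follows the default WCF service template and is an assumption rather than the author’s original code:

```csharp
using System.Text;
using System.Threading;
using Microsoft.IdentityModel.Claims; // WIF 1.0

public string GetData(int value)
{
    var sb = new StringBuilder();
    sb.AppendFormat("You entered: {0}\r\n", value);

    // WIF replaces the thread principal with an IClaimsPrincipal,
    // so the identity carries the claims issued by the STS.
    var identity = Thread.CurrentPrincipal.Identity as IClaimsIdentity;
    if (identity != null)
    {
        foreach (Claim claim in identity.Claims)
        {
            sb.AppendFormat("{0}: {1}\r\n", claim.ClaimType, claim.Value);
        }
    }
    return sb.ToString();
}
```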
6. Now, compile everything and run the console application.
7. Set the console application as the start-up project and press F5. You should receive the following console prompt after a while:
Happy Coding:)
I’ve been configuring TMG automatic client proxy detection and it was failing. After researching it online and internally, I didn’t find anything useful.
So I installed the TMG client and ran this test:
C:\Program Files (x86)\Forefront TMG Client>FwcTool.exe testautodetect
FwcTool version 7.0.7734.100
Forefront TMG Client support tool
Copyright (c) Microsoft Corporation. All rights reserved.
Action: Test the auto detection mechanism
Type: Default
Detection details:
Timeout is set to 60 seconds
Locating WSPAD URL on the Active Directory server
WSPAD object was found in the global Active Directory container
WSPAD URL found on the Active Directory server: http://XXX.YYY.com:8080/wpad.dat
Initializing Web server connection
Resolving IP addresses for XXX.YYY.com
Resolved 1 address(es): 10.0.110.110
Connecting to address #1: 10.0.110.110:8080
Waiting for address #1 to connect
Address #1 successfully connected
Requesting wspad.dat file
Web server is connected and ready to send WSPAD file
Downloading WSPAD file
WSPAD file was downloaded successfully
WSPAD file is in bad format
Failed to detect Forefront TMG
Result: The command failed and was not completed.
So in order to fix that, I changed the SCP from wpad to wspad using adsiedit (or you can use the tmgadconfig tool again with the –f switch).
Hope this helps!
M
Below are the operational tasks which should be done on a regular basis. There is still work to be done on deciding who will perform each task.
Daily Tasks
The following tasks should all be carried out on a daily basis:
Weekly Tasks
These tasks should be carried out every week:
Monthly Tasks
Take care of these tasks once a month:
As Needed
The following tasks should be performed as necessary; however, they are frequently also covered by standard procedures:
Additional Tasks (may be farmed to a Server Operations Group)
The following tasks are general server-based tasks that may be executed by a server operations team (or another group) and are not necessarily specific to SharePoint Server services.
Prioritize placement of the following on the fastest disks, in this order:
- tempdb data files and transaction logs
- Content database data files
- Search databases, except for the Search administration database
- Content database transaction log files
* updated the post as per the guidance from the technet article http://technet.microsoft.com/en-us/library/cc298801.aspx (Storage and SQL Server capacity planning and configuration)
After installing SQL Server on a machine, it may happen that you then connect that machine to a domain or disconnect it from one. When you do this, the administrator account can no longer access the database engine. The steps below allow you to regain access to the SQL Server...
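The actual steps are cut off in this chunk; for reference, one common recovery approach (an assumption, not necessarily the author’s exact procedure) is to restart SQL Server in single-user mode, restricted to sqlcmd, and re-grant the sysadmin role. The instance, domain, and account names below are placeholders:

```shell
net stop MSSQLSERVER
net start MSSQLSERVER /m"SQLCMD"
sqlcmd -E -S . -Q "EXEC sp_addsrvrolemember 'CONTOSO\Administrator', 'sysadmin';"
net stop MSSQLSERVER
net start MSSQLSERVER
```

While the instance runs with /m"SQLCMD", only a single sqlcmd connection is accepted, which lets a local administrator connect and restore access before restarting the service normally.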