The new RestrictedAdmin RDP – Security Trade-Off and Pass-the-Hash Exposure

Windows 8.1 and Windows Server 2012 R2 ship with a new feature called RestrictedAdmin RDP, in which your credentials are no longer stored on the remote computer.

I have read many arguments on the internet about whether this is a good security feature or something that makes you more vulnerable to pass-the-hash attacks. In this blog post, I will try to share my thoughts with you.

To frame this hot topic, I will start by comparing Interactive Logon and Network Logon.

Interactive Logon

  • John inputs his credentials to the machine by entering his username and password.
  • The machine checks whether the credentials are right by contacting a domain controller (using Kerberos by default, or NTLM when Kerberos is not available).
  • If the domain controller confirms that identity, the user is authorized to access the machine, and Single Sign-On (SSO) data is stored on that machine. This can be a Ticket Granting Ticket (TGT) or the NTLM hash of the user's password. SSO data is stored in memory and is required to give John a Single Sign-On experience, so he can access network resources without typing his credentials over and over (you can inspect this cached data as shown below).
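
If you want to see this SSO data in practice, the built-in klist tool lists the Kerberos tickets cached for your logon session. A minimal illustration; the actual output depends entirely on your domain environment:

# List the Kerberos tickets (including the TGT) cached in memory for the current logon session
klist

# List the logon sessions on the machine (run from an elevated prompt)
klist sessions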


Network Domain Logon

  • So now John logs on to his machine using Interactive Logon, and his SSO data is stored in memory as shown in the previous figure.
  • When John wants to access a network resource, like a remote file share, using Network Domain Logon, an SSO token derivative (a Kerberos TGS ticket or a challenge encrypted with the NTLM hash) is used to prove the user’s identity to the target machine.
  • The target machine uses the Domain Controller to validate the authenticity of the SSO derivative and to receive authorization data for the user.  It’s important to note that the SSO token itself does not leave the user’s machine and specifically, it is not sent to the target machine.

Which one is better?

  • From John’s machine perspective, Network Logon is better: when he accesses a network share, he can do that using Network Logon, and his actual SSO data is not sent to the target server. Network Logon thus reduces the user’s exposure to pass-the-hash attacks.
  • From the remote server perspective, allowing Network Logon means that an attacker who has obtained a user’s hash can use Network Logon to access it. On the other hand, if that server does not allow Network Logon, then a pass-the-hash attack is not possible. In other words, a server that does not allow Network Logon is not vulnerable to pass-the-hash attacks.

How does a normal RDP connection work (without /RestrictedAdmin)?

Prior to Windows 8.1, the only way to connect and authenticate to a remote computer using RDP was with the Remote Interactive Logon Process.

  • John enters his credentials to the RDP client.
  • The RDP client performs a Network Logon to the target server to authorize John.
  • Once John is authorized, the RDP client securely relays the credentials to the target machine over a secure channel.
  • The target server uses these credentials to perform an Interactive Logon on behalf of John.

Note: the remote server must gain access to the actual credentials to allow the remote desktop connection.

How does a RestrictedAdmin RDP connection work?

Using this mode with administrative credentials, RDP will try to interactively log on to the remote server without sending credentials. RestrictedAdmin mode does not at any point send plain text or other reusable forms of credentials to remote computers.

This means that if an attacker has only the hash of the password, he can access a remote computer using RestrictedAdmin mode, because the actual credentials are no longer a requirement to establish the connection. Without RestrictedAdmin mode, knowing the actual credentials is a must.

In other words, Network Logon is used heavily with RestrictedAdmin RDP, which means that either NTLM or Kerberos will work by default.

Previously, if you knew the admin hash, you could pass-the-hash with a tool like psexec and take over the remote system if SMB/RPC (ports 445, 135, 139) were exposed. But because many administrators already block these ports, leaving only inbound RDP allowed, the attacker can now pass-the-hash using the RDP protocol (see the sketch below).
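
For reference, RestrictedAdmin mode is controlled on the target server by the DisableRestrictedAdmin registry value, and requested from the client with the /RestrictedAdmin switch of mstsc.exe. A minimal sketch, assuming Windows 8.1 / Server 2012 R2 on both ends (the server name is a placeholder):

# On the target server (elevated): allow RestrictedAdmin connections (0 = enabled)
New-ItemProperty -Path "HKLM:\System\CurrentControlSet\Control\Lsa" `
    -Name "DisableRestrictedAdmin" -Value 0 -PropertyType DWord -Force

# On the client: request a RestrictedAdmin session; no reusable credentials are sent
mstsc.exe /RestrictedAdmin /v:server01.contoso.com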

Public key Infrastructure from IT and Business Perspective

Imagine that the internet is a city; it would be the most crowded city in the world. Inside this city, you would also discover that not everyone is who they seem to be, perhaps not even you.

Internet like a city

Inside a small company and with face to face interactions, you would use badges with pictures and names on them to identify people working in the company. If the badge has the company’s logo, then you can assume that the person is authentic.


When it comes to digital collaboration and e-commerce transactions, you usually deal with people you have never met before, perhaps located on the other side of the world, yet you need a way to verify their identity and perhaps send them information that no one else should see across the open internet.

Public Key Infrastructure is a framework that helps identify and solve these problems for you by establishing a safe and reliable environment for electronic transactions on the internet. It uses public key encryption techniques to protect the confidentiality, integrity, authenticity and non-repudiation of data.

PKI Framework

People and services on the internet are issued digital certificates that uniquely identify them in the digital world, much like the corporate badge with your photo and name on it.

The Certificate Authority is the component responsible for issuing digital certificates after verifying the identity of the requester. If you trust the certificate authority, then you can trust the digital certificates it issues.

A certificate authority maintains a revocation list that contains all digital certificates cancelled or suspended before their expiry dates.

Each digital certificate is associated with a pair of keys: a private key, kept secret by the holder of the digital certificate, and a corresponding public key, which is known to others.

Digital Certificate Picture

This pair of asymmetric but matching keys will be used for data encryption to ensure confidentiality.

Take email message transmission as an example. A sender can use the intended recipient’s public key to secure the content of an email message. When the recipient receives the message, he will need to use the corresponding private key that he keeps in order to unlock the message. By doing so, the confidentiality of the email content is protected (see the sketch below).
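
To make this concrete, here is a minimal PowerShell sketch of the same idea using the built-in CMS cmdlets (PowerShell 5.1 and later). It assumes a certificate with the Document Encryption purpose already exists in the recipient's store; the subject name is a placeholder:

# Encrypt with the recipient's PUBLIC key (certificate located by subject name)
$secret = Protect-CmsMessage -To "cn=john@contoso.com" -Content "Meet me at noon."

# Only the holder of the matching PRIVATE key can read it back
Unprotect-CmsMessage -Content $secret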

Furthermore, integrity, authenticity and non-repudiation of the email message can be achieved by signing a message digest, which ensures the message is not altered during transmission.

Public Key Infrastructure enables a wide variety of technologies: SSL for secure browsing and transactions, stronger wireless security through industry-standard verification and authentication, and secure remote access to your enterprise network. In addition, you can provide encryption and digital signatures for your corporate email communication, and perhaps encrypt sensitive documents on your local drives. And since passwords are a very basic method of authentication, implementing smart cards for two-factor authentication is the best way to raise your authentication level.

Public Key Infrastructure Technologies







Exchange 2016 Hybrid: TLS negotiation failed with error UnknownCredentials


I was adding a couple of Exchange 2016 CU2 servers to the Hybrid Configuration Wizard to send and receive emails to Exchange Online. In the Exchange Online admin center, I configured the receive connector from Office 365 to verify the subject name on the certificate for TLS authentication.

The problem is that emails are not being sent to Office 365 via the send connector. After enabling protocol logging on my Exchange 2016 hybrid servers [Get-SendConnector "Outbound to Office 365" | Set-SendConnector -ProtocolLoggingLevel Verbose], and opening the SmtpSend log file, I can see many TLS failures:

2016-07-19T12:13:14.863Z,Outbound to Office 365,08D3AFC581A92DD3,3,,,>,EHLO,
2016-07-19T12:13:14.910Z,Outbound to Office 365,08D3AFC581A92DD4,2,,,<,"220 Microsoft ESMTP MAIL Service ready at Tue, 19 Jul 2016 12:13:14 +0000",
2016-07-19T12:13:14.910Z,Outbound to Office 365,08D3AFC581A92DD4,3,,,>,EHLO,
2016-07-19T12:13:15.004Z,Outbound to Office 365,08D3AFC581A92DD3,4,,,<,250 Hello [] SIZE 157286400 PIPELINING DSN ENHANCEDSTATUSCODES STARTTLS 8BITMIME BINARYMIME CHUNKING,
2016-07-19T12:13:15.004Z,Outbound to Office 365,08D3AFC581A92DD3,5,,,>,STARTTLS,
2016-07-19T12:13:15.051Z,Outbound to Office 365,08D3AFC581A92DD4,4,,,<,250 Hello [] SIZE 157286400 PIPELINING DSN ENHANCEDSTATUSCODES STARTTLS 8BITMIME BINARYMIME CHUNKING,
2016-07-19T12:13:15.051Z,Outbound to Office 365,08D3AFC581A92DD4,5,,,>,STARTTLS,
2016-07-19T12:13:15.145Z,Outbound to Office 365,08D3AFC581A92DD3,6,,,<,220 2.0.0 SMTP server ready,
2016-07-19T12:13:15.145Z,Outbound to Office 365,08D3AFC581A92DD3,7,,,*," CN=*, OU=IT, O=contoso International (L.L.C), L=Dubai, S=Dubai, C=AE CN=thawte SHA256 SSL CA, O=""thawte, Inc."", C=US 0D92CFF6070B73AD5722EC8B4DA3389B AAA3D3DADA6891A2CCB3134D0B2D7764F1351BC4 *",Sending certificate Certificate subject Certificate issuer name Certificate serial number Certificate thumbprint Certificate subject alternate names
2016-07-19T12:13:15.145Z,Outbound to Office 365,08D3AFC581A92DD3,8,,,*,,TLS negotiation failed with error UnknownCredentials

I am sure the certificate is fine, as the other hybrid servers use the same certificate and they are able to send emails to Office 365. Also, in Event Viewer, I am seeing the following error:

TLS Error Office 365 Exchange Hybrid


So finally, I tried something and it worked. I opened the certificate store and checked the permissions on the private key of the certificate I am using for the TLS connection.

TLS Error Office 365 Exchange Hybrid2

I can see the following permissions on the private key:

TLS Error Office 365 Exchange Hybrid3


So I added Network Service and gave it Read access. After that, everything worked just fine. Try giving Everyone Read access if things are still not working. A scripted version of the same fix is sketched below.
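
If you prefer to script the permission change instead of using the certificates MMC, here is a minimal sketch for a machine certificate whose key was generated by a legacy CSP (the thumbprint is the one from the log above; note that CNG keys live in a different folder, so treat this as illustrative):

# Locate the certificate by thumbprint in the local machine store
$cert = Get-Item "Cert:\LocalMachine\My\AAA3D3DADA6891A2CCB3134D0B2D7764F1351BC4"

# Resolve the private key file under the MachineKeys folder
$keyName = $cert.PrivateKey.CspKeyContainerInfo.UniqueKeyContainerName
$keyPath = Join-Path "$env:ProgramData\Microsoft\Crypto\RSA\MachineKeys" $keyName

# Grant NETWORK SERVICE read access to the private key file
icacls $keyPath /grant "NETWORK SERVICE:R"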

Hope this will help someone, leave a note if it did🙂

SharePoint workflow – duplicate emails sent and suspended workflow


I came across a case in which the workflow should send an email notification to a SharePoint group and then continue doing other work.

The problem is that the workflow sent the same notification 10 or more times, and then the whole workflow got suspended with the error:

RequestorId: cc9d6fb4-dbf3-69a2-0000-000000000000. Details: An unhandled exception occurred during the execution of the workflow instance. Exception details: System.ApplicationException: HTTP 500 {"Transfer-Encoding":["chunked"],"X-SharePointHealthScore":["0"],"SPClientServiceRequestDuration":["5653"],"SPRequestGuid":["cc9d6fb4-dbf3-69a2-be79-a2438114b8f6"],"request-id":["cc9d6fb4-dbf3-69a2-be79-a2438114b8f6"],"X-FRAME-OPTIONS":["SAMEORIGIN"],"MicrosoftSharePointTeamServices":[""],"X-Content-Type-Options":["nosniff"],"X-MS-InvokeApp":["1; RequireReadOnly"],"Cache-Control":["max-age=0, private"],"Date":["Wed, 08 Jun 2016 11:31:34 GMT"],"Set-Cookie":["NSC_tqtjuft.bsbnfy.ofu_iuuqt_jou=ffffffff09131f5945525d5f4f58455e445a4a423660;Version=1;Max-Age=1800;path=\/;secure;httponly"],"Server":["Microsoft-IIS\/7.5"],"X-AspNet-Version":["4.0.30319"],"X-Powered-By":["ASP.NET"]} at Microsoft.Activities.Hosting.Runtime.Subroutine.SubroutineChild.Execute(CodeActivityContext context) at System.Activities.CodeActivity.InternalExecute(ActivityInstance instance, ActivityExecutor executor, BookmarkManager bookmarkManager) at System.Activities.Runtime.ActivityExecutor.ExecuteActivityWorkItem.ExecuteBody(ActivityExecutor executor, BookmarkManager bookmarkManager, Location resultLocation)


It turns out that one of the users in the SharePoint group had resigned and his account was disabled, which causes SharePoint to behave like this.



ISO/IEC 27001 – Information Security Management – Lead Implementer


I just got the ISO/IEC 27001 – Information Security Management – Lead Implementer certification from BSI Group. It was an exciting, full five-day training by BSI Group in the UK about Information Security Management Systems.

The idea behind the course is to help organizations build a framework for establishing, implementing, maintaining and continually improving information security management systems in the context of the organization. Information security is the preservation of information confidentiality, integrity and availability.

It all starts with identifying the organization's internal and external context, and all requirements of internal and external parties, in order to identify and scope an information security policy that is imposed by top management.

Roles and authorities are then identified to support these requirements, and all resources shall be allocated. Communication and awareness shall take place to support the policy, and to make sure everyone is aware of his role and responsibility.

Risk analysis will identify risks that should be treated according to a well-documented plan, and any non-conformity shall be addressed by a corrective action. Monitoring and measurement will then ensure that the information security objectives are achieved, and an audit program is to be established to report on the effectiveness of the information security management system.

Most organizations are actually doing this without realizing it. Suppose an organization wants to enhance its business offering by allowing online credit card payments. An external requirement in this case is the PCI compliance certification that is required to support e-payments online; this is what we call an external party. The customer wants an option to pay online, so the customer is called an interested party. Top management wants to enable this payment method, so it will allocate resources and communicate the new payment option with a commitment to secure payments for customers (this is, in effect, a policy that shall be communicated). From there, risk analysis will take place to ensure credit card information is kept safe and secure, and perhaps an internal audit program will validate those confidentiality measures to top management.

What is interesting about this topic is that information security is not an IT or technology concept, as it applies to information in all its forms. It could be the knowledge in people’s heads or on a piece of paper. Most people think information security is an IT topic, while it has a much bigger scope. We can say that information security is a big umbrella and IT security is part of it.

No Logged on Office Users are configured for Information Rights Management (IRM)

Hi, this is a short blog post to share with you an issue I faced with Office 365 and IRM (RMS: Rights Management Service).

My mailbox is hosted in Office 365, and Outlook 2016 shows the "No Logged on Office Users are configured for Information Rights Management (IRM)" error when I try to use IRM.

IRM and Outlook and Office 365

This is a reported issue, and Microsoft has a registry fix that you can apply to your Outlook to solve it. Here is the fix in this link.


Configuration Manager 2012 R2 not updating Heartbeat DDR and Hardware Scan info

Suddenly, and without warning, Configuration Manager 2012 R2 no longer updates the Heartbeat DDR and Hardware Scan fields. On the other hand, machines are sending hardware inventory and communicating with Configuration Manager in a healthy way, and we can see updated hardware inventory for all machines. The only issue is that these two fields show a very old date. This can be harmful if, like me, you use these fields for various reports.

After opening a case with Microsoft, the support engineer confirmed that the Configuration Manager implementation is healthy and everything is working fine. After several escalations, Microsoft's solution was to create a SQL job that runs every day and executes a built-in stored procedure called exec_CH_syncClientSummary. The solution worked like magic; a sketch of the call is below.
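
If you want to test the same stored procedure by hand before scheduling it as a SQL Agent job, a minimal PowerShell sketch would look like this (it assumes the SQL PowerShell module is available; the server name and the CM_P01 site database name are placeholders for your environment):

# Run the built-in summarization procedure against the site database
Invoke-Sqlcmd -ServerInstance "CMSQL01" -Database "CM_P01" `
    -Query "EXEC exec_CH_syncClientSummary"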

Configuration Manager 2012 R2 not updating 3433

SharePoint Workflow and PowerShell Report

Hi everyone,

If you have ever worked on SharePoint workflows, you will notice that sometimes your task list gets crowded with many entries, and you want a way to query that task list for information, like the number of pending tasks, overdue tasks, or any other kind of information.

For example, I may want to create a PowerShell script that queries a specific task list and sends a weekly report to each user about his pending tasks via email, in a nicely formatted HTML table. While SharePoint workflow task actions can send overdue emails, they do that per task, not by aggregating all pending tasks per user.

People do not want to get an email for each pending task; instead, they want a weekly or daily email showing all their pending tasks. You can do anything once you know how to interact with SharePoint lists using PowerShell.

I will not give you a ready PowerShell script to do all the magic, but instead, I will show you how to start doing that, and how to use PowerShell to get the essential data. You can then use your PowerShell skills to do whatever you want with it (Send email, create an HTML table, aggregate data..).

First of all, you must load the SharePoint snap-in inside your PowerShell host environment.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

Next, you want to connect to your list:

$webUrl = ""

$web = Get-SPWeb $webUrl

$list = $web.GetList("")

Here my SharePoint site URL goes into $webUrl, while my list URL is the one I used in the last command.

Now that we are connected to the list through the $list variable, $list.Items is a collection of all the items in that list. So, for example, to get the first item of the list:

$FirstItem = $list.Items[0]

This returns the first item of the list, and since we are talking about a task list, we can expect this item to have all the columns of a typical task list.

To get this task's due date:
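
A minimal sketch, assuming the default due date column of a task list:

# Read the "Due Date" column (internal name: DueDate) of the first task
$dueDate = $FirstItem["DueDate"]
$dueDate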


Now we can get the task assignee for this task. This can be tricky, as this field is of type (Person or Group), and you may need the user/group ID, or perhaps the email address in case you want to send email notifications about pending tasks:

$myField = $FirstItem["Assigned To"].ToString()
$userfield = New-Object Microsoft.SharePoint.SPFieldUserValue($web, $myField)
$userfield.User.DisplayName
$userfield.User.Email

This gives you the display name and email address of the task assignee. Although it says User.DisplayName and User.Email, even if the task assignee is a group, it will bring the group's email and display name as well.

Finally, you can also do some cool stuff. Instead of getting all the task list items, storing them in the $list variable, and then looping through those tasks, it is much easier to get only the items you want. Suppose you only want to process tasks that are not completed and are past due. To do this, you use a CAML query to select only those tasks. Here is an example:

#We will set the Enddate to today
$EndDate = [Microsoft.SharePoint.Utilities.SPUtility]::CreateISO8601DateTimeFromSystemDateTime([DateTime]::Now)

#We will define a query that gets only tasks that are NOT Completed and have a DueDate less than today
$caml='<Where><And><Neq><FieldRef Name="Status"/><Value Type="Text">Completed</Value></Neq><Lt><FieldRef Name="DueDate"/><Value Type="DateTime">{0}</Value></Lt></And></Where>' -f $EndDate
$query = New-Object Microsoft.SharePoint.SPQuery
$query.Query = $caml

#Then we only get the list items matching the query
$items = $list.GetItems($query)

Now the sky is the limit. You can do any kind of crazy thing using the above commands🙂 Enjoy.

SharePoint Workflow: You wrote code that is too long to upload!!

I want to share with you a funny situation I hit while writing an extremely long and complex SharePoint workflow. I was creating a workflow that controls deployment requests, with a requirement for approvals across 9 stages. After each stage, there are a couple of email notifications and web services to call.

The code got so big that when I tried to publish my workflow code inside SharePoint Designer, I got this message:

"The request message is too big. The server does not allow messages larger than 2097152 bytes."

It seems I reached the maximum allowed size for a workflow code definition🙂. After opening a case with Microsoft, we came up with a couple of workarounds:

  • Divide the workflow into smaller workflows.
  • Re-write the workflow code in a way that redundant code can be removed.
  • Increase the upload limit beyond 2097152 bytes.

I worked with the Microsoft engineer to apply the third workaround, running the following PowerShell code on all SharePoint front-end servers and doing an IIS reset after that:

$ws = [Microsoft.SharePoint.Administration.SPWebService]::ContentService

$ws.ClientRequestServiceSettings.MaxReceivedMessageSize = 5120000

$ws.ClientRequestServiceSettings.MaxParseMessageSize = 5120000

# Persist the new limits
$ws.Update()


This will increase the upload limit to 5 MB instead of the default 2 MB. When I initially tried to read the above values, I got empty values, so the Microsoft support engineer and I assumed that an empty value actually means 2 MB.

I hope this will help someone out there. Of course, the best practice is to write workflows smartly so that the code never exceeds 2 MB, for performance reasons, as per my talk with the Microsoft engineer.

Tip: How can you know the size of your workflow code?

Open SharePoint Designer, go to Workflows, pick your workflow, and click Save As Template. This will save the template to "Site Assets".

SharePoint Workflow Code is too long 23232

Next, go to Site Assets, where you will see your workflow file with a WSP extension, along with its size. This size does not reflect the actual size of your workflow code. Now click Export File, and save it to your desktop.

SharePoint Workflow Code is too long 233232

Now go to your desktop, change the extension of the exported file to .CAB, then double-click the file to open it:

SharePoint Workflow Code is too long 233378

SharePoint Workflow Designer Tips P5


I want to start this part by talking about the rule of abstraction. Again, it is best practice not to hard-code any values inside your workflow code. Your workflow code should read all needed configuration values from an external source at execution time. This is what we talked about in the previous part of the workflow designer tips.

There is another place where abstraction can be applied. Inside your workflow code, you may need to interact with users and groups. Usually you interact with users and groups in two places:

  • When you assign a task, you usually assign it to user or a group of users.
  • When you send email notification, you send it to a mail enabled user or group.

The first tip is to avoid working with individual users and instead work with groups. This is the first level of abstraction. When you work with groups, you can externally modify the membership of the group and the magic happens right away. Even if you are assigning a task to a single user, you can assign it to a group that contains only that user. Later, when that user leaves the company and there is a need to change the task assignee, you can just change the membership of that group.

The second level of abstraction is to create a separate SharePoint list, called Workflow Subscriptions for example, that contains two columns:

  • Subscription. (Data Type: Single line of text)
  • Subscriber. (Data Type: Person or Group)

Now you can fill that list with whatever values you wish. For example, if your workflow sends email notifications to a group of people upon finishing the process for a new hire, you could create a new SharePoint or Active Directory group called (New Hire Notification Group) that contains the people who should be notified of a new hire, and then create an entry in the Workflow Subscriptions list like this:

  • Subscription: New Hire Notification.
  • Subscriber: New Hire Notification Group.

Now inside your workflow designer, when you want to send email to notify people for a new hire, you do the following:

SharePoint Workflow Designer Tips P1 65554.JPG

The same applies if you want to assign a task to a group: you create an entry in the Workflow Subscriptions list after you create a group called (New Hire First Approval Group):

  • Subscription: First Approval.
  • Subscriber: New Hire First Approval Group.

You can see that we talked in the previous part about the Configuration list, and now we are talking about the Workflow Subscriptions list. The main difference between them is the data type of the columns. The Configuration list has two single-line-of-text columns, while the Workflow Subscriptions list has one single-line-of-text column and one Person or Group column. Together, these two lists will give you enough abstraction to remove any hard-coding from your workflow code, and they make your workflow code safe from direct changes. A PowerShell sketch of resolving a subscriber from this list is below.
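
Outside the designer, the same lookup is easy to verify from PowerShell. A minimal sketch, assuming the list and column names used above (the site URL is a placeholder):

# Connect to the site and the subscriptions list (hypothetical URL)
$web = Get-SPWeb "http://sharepoint/sites/workflows"
$list = $web.Lists["Workflow Subscriptions"]

# Find the entry for a given subscription name
$entry = $list.Items | Where-Object { $_["Subscription"] -eq "New Hire Notification" }

# Convert the Person or Group column into a user/group value
$subscriber = New-Object Microsoft.SharePoint.SPFieldUserValue($web, $entry["Subscriber"].ToString())
$subscriber.LookupValue   # display name of the subscribed group (or user)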

SharePoint Workflow Designer Tips P4


In this part of the tips, I want to talk about the abstraction concept. When you write good workflow code and everything works just fine, you should not face a situation where you open SharePoint Designer to make changes unless there is a major change in the logic of how the workflow works.

If you have a change management request list that is served by your workflow and requires three approvals, and you wrote the workflow code, tested it, put the workflow in production, and everyone is happy with it, then theoretically speaking you should never need to open the workflow code again. Even if one of the approvers changes to someone else, that should not force a change to the workflow code.

This can be accomplished by introducing a level of abstraction: by not hard-coding values inside the workflow code, and by maintaining all configuration values in a separate SharePoint list that your workflow reads from at run time.

SharePoint Workflow Dashboard Tip 651681

So now, you create a SharePoint list called (Workflow Configuration), for example, with two columns:

  • Configuration Name (Single line of text)
  • Configuration Value (Single line of text)

You then start working on populating this list with any value that you may use in your workflow. Examples are:

  • If you are assigning tasks, then you can put the task due date and task title in the configuration list.
  • If you are calling web service, then you can put the web service URL in the configuration list.
  • If you have any static values or counter values, you put them here also.
  • Any switch values, for example, you may have a variable that is named (SendNotificationEnabled) and if it is true, your workflow will send notification. You can read this variable from the configuration list. That way, you can change the variable value from the configuration list without opening the workflow code.

Inside the Workflow Designer, you can read values from the Configuration List:

SharePoint Workflow Dashboard Tip 4538
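
The same read is trivial to verify from PowerShell. A minimal sketch, assuming the column names above (the site URL is a placeholder):

# Look up a single configuration value by name (hypothetical URL)
$web = Get-SPWeb "http://sharepoint/sites/workflows"
$config = $web.Lists["Workflow Configuration"]
$item = $config.Items | Where-Object { $_["Configuration Name"] -eq "SendNotificationEnabled" }
$item["Configuration Value"]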

Just think of this case with me. Your workflow calls a web service, and you kept the web service variables in a configuration list like we did here. If the URL of the web service changes for any reason, you can ask anyone to change it in the configuration list. No need for you to open the workflow in SharePoint Designer, make changes, and hit the scary Publish button.

I usually do not keep a single hard-coded value in any of my workflow code. When I write workflow code, I do not open it again unless there are major changes that affect the workflow logic. Any other changes or customizations are all kept and maintained in a separate configuration list. This even includes the subject of workflow email notifications.

Web Application Proxy P1

We all know that Microsoft is not investing in TMG and UAG the way it did before, and those products are going out of support soon. On the other hand, information workers are now using laptops and devices all the time, and from everywhere.

The way I see it, end users now have their BYOD devices with cool-looking apps, and they want to use those cool apps to connect to corporate data without needing to RDP into the corporate network and use a VDI or RDS Session Host experience.

On the other hand, while VPN and DirectAccess are great technologies for remote access, from an IT administrator's perspective these types of technologies simply let the user IN or OUT, and it is hard to control where the user can go once he is IN.

Microsoft came up with a new approach to remote access called “Conditional Access”, and added flexible authentication methods to the solution. The solution is called “Web Application Proxy”.

Web Application Proxy (a.k.a. WAP) is a new Windows Server 2012 R2 role service under the Remote Access server role, integrated into Server Manager and the RRAS admin experience.

Microsoft has now announced another new category: a conditional access reverse proxy called “Web Application Proxy”. Reverse proxies are not a new technology for Microsoft, as they had TMG and UAG before. What’s new about Web Application Proxy is its unique interaction with Active Directory Federation Services and devices. Web Application Proxy is part of Windows Server, not a separate installation like TMG or UAG.

From the information worker's perspective:

  • Access corporate apps from anywhere, on any device, Windows and non-Windows
  • SSO and a native device/app experience

From the IT pro's perspective:

  • Selectively publish apps
  • Control access per app, user, device, and location
  • Better protection with pre-authentication (optional)
  • No change required in existing apps
  • No change required on devices (client-less)

WAP: Fundamental Services

  1. Reverse Proxy Services:
    • Network Isolation
    • Basic DoS protection: throttling, queuing, session establishment before routing to the backend
    • URL Translation
    • Selective Publishing: per internal application endpoint
    • ADFS Proxy Services
    • Web Protocols Only: HTTP, HTTPS
  2. Pre-Authentication Services:
    • Rich Policy: user + device identity, application identity, network location
    • MFA options (multi-factor authentication)
    • SSO

Network Topology

Web Application Proxy is usually located in your corporate DMZ, with one or two network cards.

You can choose not to join it to your domain if you like, but if you want to use the Kerberos constrained delegation method, then you have to join it to the domain.

Web Application Proxy can authenticate requests before forwarding them to the back-end applications (this is called pre-authentication), or it can just pass the traffic to the back-end application without authentication (this is called pass-through). A publishing sketch for both modes is shown below.
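
For illustration, publishing an application in each mode looks roughly like this with the WAP PowerShell cmdlets (the URLs, thumbprint and relying party name are placeholders for your environment):

# Publish an app behind ADFS pre-authentication
Add-WebApplicationProxyApplication -Name "LOB App" `
    -ExternalUrl "https://lob.contoso.com/" `
    -BackendServerUrl "https://lob.contoso.com/" `
    -ExternalCertificateThumbprint "<certificate thumbprint>" `
    -ExternalPreauthentication ADFS `
    -ADFSRelyingPartyName "LOB App"

# Or publish it with no authentication at the edge (pass-through)
Add-WebApplicationProxyApplication -Name "LOB App PassThrough" `
    -ExternalUrl "https://lob2.contoso.com/" `
    -BackendServerUrl "https://lob2.contoso.com/" `
    -ExternalCertificateThumbprint "<certificate thumbprint>" `
    -ExternalPreauthentication PassThrough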

Web Application Proxy cannot live without ADFS, because ADFS provides the following for Web Application Proxy:

  • Configuration storage: WAP is stateless, and its configuration is stored in ADFS.
  • Pre-authentication: WAP uses ADFS for authentication.

Web Application Proxy 1

Web Application Proxy Part2

Relying Party

I want to start by defining the term (Relying Party). ADFS has many relying parties; these are the systems or devices that trust ADFS for authentication. In our context, ADFS has two relying parties:

  1. Web Application Proxy itself is a relying party to ADFS, because it trusts ADFS for authentication.
  2. The LOB applications are relying parties to ADFS, because they trust it for authentication.

Note: ADFS is an STS, which stands for "Security Token Service".

Suppose we have a line of business (LOB) application, our ADFS that contains the application policies, and the Web Application Proxy. The LOB application is accessible internally using http://lob. The ADFS URL is published externally via the WAP (Web Application Proxy).


What we will do is publish the LOB app on the WAP using the fully qualified domain name and with SSL. The WAP will send a 302 Redirect response to the ADFS URL to do the pre-authentication; ADFS will authenticate the user and send him a token.


SharePoint Workflow History Data and Logs Tips – P3

We talked about the workflow log types (audit and debug), and we agreed that using the built-in "Log to History List" action inside SharePoint Designer is not the preferred way, from my personal point of view. So what is the better way to do logging?

You can think of a SharePoint list as a database table. It has columns, and each column has a specific type. Tables in a relational database model can relate to each other using keys. In SharePoint, lists can relate to each other in the same way, using a column type called (Lookup).


Take this example: we have a process to log changes happening in data centers. We need to track those changes and require an approval for every change. Finally, auditors will require some kind of log to be exported as proof of the integrity of the change control process. You also need some debug logs for troubleshooting purposes.

Let us start designing the solution by defining a couple of content types. If you are not familiar with content types inside SharePoint, I suggest you start learning about this concept. In my own words, content types are like creating a class in any programming language. You create a class, define some properties and the data type for each property, and then you can create instances of the class whenever you want.

The same thing applies here. Content types are like classes, and site columns are like class properties. If you are still not comfortable with content types, just create normal lists and columns; but in this example, I will use content types, just because I like to.

We will create three content types (the parent content type is shown in the picture):

  • Data Center Changes
  • Audit Log
  • Debug Log

SharePoint Workflow Designer Tips P1 3

We will also create three lists:

  • Data Center Changes List.
  • Audit Log List.
  • Debug Log List

Data Center Changes content type has two columns:

  • Title (Single Line of Text)
  • Data Center Change Reason (Multiple lines of text)

SharePoint Workflow Dashboard Tip 32

Debug Log content type has two columns:

  • Debug Message (Type: Single line of text)
  • Change ID (Type: Lookup) – maps to the Data Center Changes “ID” column.

SharePoint Workflow Dashboard Tip 33339

SharePoint Workflow Dashboard Tip 333391

The same applies to the Audit Log content type; it has two columns in the same way:

  • Audit Message (Type: Single line of text)
  • Change ID (Type: Lookup) – maps to the Data Center Changes “ID” column.

Now we have the following relation between our three lists:

SharePoint Workflow Dashboard Tip 99

When you are inside SharePoint Designer and you want to write a debug message, choose the (Create List Item) action and pick the Debug Log list, as per the following:

SharePoint Workflow Dashboard Tip 456

As shown, you have to populate the (Change ID) with (Current Item:ID), and then fill the (Debug Message) with the log content you want.

The same applies when you want to create an audit message. Finally, you can go to the Debug Log List and the Audit Log List, create a view with (Group By) on the Change ID field, and you will get all debug and audit messages per Change ID. (A sketch of writing the same entry from PowerShell is below.)
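
For completeness, creating the same debug entry from PowerShell looks roughly like this (a minimal sketch; the site URL and the item ID are placeholders, and the column names are the ones defined above):

# Connect to the site and the debug list (hypothetical URL)
$web = Get-SPWeb "http://sharepoint/sites/changes"
$debugList = $web.Lists["Debug Log List"]

# Create a new debug entry tied to the Data Center Changes item with ID 42
$entry = $debugList.Items.Add()
$entry["Debug Message"] = "Web service call returned status 200"
$entry["Change ID"] = 42      # lookup to the Data Center Changes "ID" column
$entry.Update()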

You can use SharePoint Information Management Policy to apply a retention policy that purges the debug messages from the Debug Log List after, say, one month, and another retention policy that purges the audit messages from the Audit Log List after, say, one year.

I hope this highlights some of the great benefits of using such an approach to create and maintain the logs generated by your workflows.

SharePoint Workflow History Data and Logs Tips – P2

We talked in part one about how workflow logs can be classified as audit logs or debug logs. Audit logs have long retention and are used by auditors and security teams as proof of a controlled process, while debug logs have short retention and are used by the people maintaining the workflow for troubleshooting purposes and to track the execution state of the workflow at different execution times.

SharePoint Workflow Designer gives you a built-in way to write log data to a history list using the (Log to History List) action. I am not a fan at all of this way of logging, for several reasons.

Let us talk a little bit about Log to History List and about the history list itself. By default, a hidden list called the History List is created when you create your first workflow in a site. You can use this history list for different workflows, or you can choose or create a different history list for each workflow.

One of the things I do not like about such history lists is that they accept only a single string at a time. There are no other columns in that history list that you can benefit from. Not that this is a big limitation, but I do not like being restricted to sending one string at a time. It makes it hard to filter and analyze the log data, as I will describe later, because of the lack of other columns.

The second thing is that a workflow can be associated with only one history list for logging. Take this example: you have a workflow that calls a web service, and you want to log the status code or perhaps the value returned from that web service call. The only option you have here is Log to History List. You also want to log other events during workflow execution. Now, someone comes to you and asks for a report of all web service calls and their return codes for analysis. You have to go and open that hidden history list from SharePoint Designer, and then what will you see? A lot of lines, without any ability to filter the logs related to the web service calls or to track them back to the list item that caused the workflow to start. My point is that it is very hard to look at the history list and track certain events, or do any kind of filtering.

We also talked about two types of logs: audit and debug. Your only option here is to use Log to History List to write both the entries related to auditing and the other log entries (debug data) you need to troubleshoot the workflow. Now, when the auditors ask you to extract a report for a certain item, the logs are mixed between audit and debug. Also, you may want to keep log entries used for auditing for a longer time than those used for debugging, which you cannot do in this case, because both are saved in the same history list.

Things become more interesting when you read about the "Workflow Auto Cleanup" timer job. It is not best practice to disable this job or change its duration, by the way. This job will remove the association between workflow tasks and history data after 60 days by default. You can read more about this here. But what about the need to keep audit data for one year, for example? What will you do then? People will disable this timer job, but Microsoft keeps saying it will affect the performance of the product. If Microsoft implemented this timer job, then there is a reason for it.

I think that for professional workflows, especially those that require auditing, you should not use the built-in (Log to History List) action. I will describe the way I prefer in the next part.

SharePoint Workflow Designer as RDP APP

If you are responsible for writing SharePoint workflows using SharePoint Designer, I want to share with you a small tip when it comes to using the SharePoint Designer console.

Usually, you have SharePoint servers and perhaps Workflow Manager in your data center, and you may have installed SharePoint Designer on your machine and connect remotely in order to start coding workflows.

SharePoint Workflow Designer Tips P1 233622

What I do not like in this case is the dependency on the network link. Sometimes you work remotely from a hotel room, connected to an unreliable wireless network, with a VPN into your corporate network; you open the SharePoint Designer console on your machine, open a very big workflow definition, make some modifications, and hit Publish. You do not know what will happen if the network connectivity is not reliable.

What I prefer is to have a Remote Desktop server in the data center for various administrative tasks. You can then install SharePoint Designer on that Remote Desktop server, log on to it remotely, and open SharePoint Designer from there. That way, when you hit Publish, the changes are pushed from the terminal server to the SharePoint farm without depending on any unreliable connection. Furthermore, I have also exported SharePoint Designer as a RemoteApp and copied it to my desktop. Whenever I want to use the designer, I just open the RDP file on my machine, which connects via RDP to the RDS server in the data center and gives me a great experience.

SharePoint Workflow Dashboard Tip 4269

Even if you connect from your hotel room, over an unreliable wireless network, via VPN to your corporate network, you will RDP to that RDS server, use SharePoint Designer from there, open your big workflow code, make your changes, hit Publish, and not worry about anything. In fact, you can close your VPN connection, and in the background, SharePoint Designer will take its time publishing your workflow changes without any networking issues.

SharePoint Workflow History Data and Logs Tips – P1

I want to talk about metadata, SharePoint Workflow History logs, and how to use this data for different purposes. People underestimate this part when thinking about workflows, and just focus on how to do the workflow logic.

I usually classify the log data that comes out of a workflow into two types: audit data and debug data.

Audit data is the data that auditors ask for in order to validate the integrity of a process. If you have a solid change control policy in place, and you have a workflow in SharePoint to control a process, auditors usually ask for proof in the form of workflow logs. For example, if you have a workflow to control the process of creating service accounts in Active Directory, auditors will come and ask you for proof that these accounts are created in Active Directory only after passing through a workflow approval cycle. To do that, you can code your workflow to generate audit log data to be exported and shown to the auditing team once requested.

SharePoint Workflow Dashboard Tip 12342

Let me give you another example. I had a solution that is used to track new changes in a data center. To do that, I created a list in SharePoint where anyone can submit a change request that passes through a couple of approvals. The auditors and security team come every quarter and ask: "Give us proof that any change in the data center is logged and passed through approval". IT admins will then go and export the audit logs as proof. Auditors also require audit data to be available for a full year for any change happening in the data center.

Audit data is used mainly for security reviews and should have long retention (one year, for example). The main purpose of such logs is auditing: proof that an approval cycle is in place to control the process actions.

On the other hand, SharePoint workflow logs can be debug logs. Debug logs are used mainly by the team writing and supporting the workflow. If you write a complex workflow, you would like to write a couple of log entries at certain points of the workflow life cycle to track what the workflow is doing and whether it is working as expected.

Suppose you have a workflow that calls a web service, sends a couple of email notifications, and starts approval requests. For you, it would make sense to generate a log entry before calling the web service and after you get the response back, and perhaps log the response code, to make sure the web service call is working just fine. You could also log that you are about to send email notifications, and that a task has been generated for a certain person and is waiting for his approval. All these logs are used only by you, the workflow programmer, to track the workflow status and actions at different points of execution.

Usually, debug logs will have a very short retention, as you only want to keep such logs for a month in most cases. Auditors and the security team do not care about these log messages.

In the next parts, I will start talking about how to plan these logs, and why I personally do not like to use the built-in workflow history for this purpose.