A quick script for logging into Office 365 with PowerShell

Connecting to Office 365 for management is pretty straightforward, but Microsoft does make changes to its service names on occasion.  If you have multiple Office 365 tenants it can also be a pain to keep them all straight.  This tiny script can be saved as a .ps1 file and modified for each account to make logging into Office 365 easy.

The script prompts for the connecting account's password directly in the PowerShell window.  You could store the password in a variable instead, but I'd recommend against it since that means keeping a password in plain text.  I will cover encryption and secure strings in another blog post.

I add Get-MsolCompanyInformation at the end to make sure I am connected to the right tenant.

This topic has been covered before, of course; this post just pulls together the most recent ConnectionUri along with snippets from TechNet and other blogs.  You can see them here:

http://jeffwouters.nl/index.php/2012/12/keep-your-powershell-script-open-when-executed/

https://technet.microsoft.com/en-us/library/jj984289(v=exchg.160).aspx

https://blogs.technet.microsoft.com/mitchelatmicrosoft/2014/12/22/connecting-powershell-to-your-office-365-tenant-to-manage-exchange-and-azure-ad-simultaneously/

param ( $Show )
# Relaunch the script in a new PowerShell window that stays open (-NoExit),
# passing an argument so $Show is set on the second run and we fall through.
if ( !$Show )
{
    PowerShell -NoExit -File $MyInvocation.MyCommand.Path 1
    return
}
Import-Module MSOnline
$username = "ADMIN-ID@DOMAINNAME.TLD"
Write-Output "Enter password for account $username"
$password = Read-Host -AsSecureString
$objcred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $username,$password
# Exchange Online remote PowerShell session
$Office365Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $objcred -Authentication Basic -AllowRedirection
Import-PSSession $Office365Session
# Azure AD / MSOnline connection, then confirm which tenant we landed in
Connect-MsolService -Credential $objcred
Get-MsolCompanyInformation
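
One addition that is not part of the original script but is worth considering: tear the remote session down when you are finished, otherwise you can bump into the service's limit on concurrent remote sessions.

# Clean up the Exchange Online session when you are done managing the tenant
Remove-PSSession $Office365Session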

Remember you need the latest prerequisite software on the system you are using to connect.  Staying on the latest version is critical as Microsoft is constantly changing their infrastructure.

Sign-In Assistant:

http://www.microsoft.com/en-us/download/details.aspx?id=39267

Windows Azure Active Directory Module for Windows PowerShell

http://go.microsoft.com/fwlink/p/?linkid=236297

Hitting a message about script execution policy?

Set-ExecutionPolicy RemoteSigned
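
If you would rather not change the machine-wide policy, a quick sketch of a more conservative approach:

# See the current policy at each scope before changing anything
Get-ExecutionPolicy -List

# Limit the change to your own account instead of the whole machine
Set-ExecutionPolicy RemoteSigned -Scope CurrentUser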

Microsoft Direct Access

In today’s day and age, being able to provide employees with the ability to access files, applications, and internal websites from anywhere is vital. This is true whether you have employees who telecommute, who are on business trips, or who simply need to keep up with business while away from the office. Microsoft Direct Access provides this in a very simple way.

What It Does

To put it simply, Microsoft Direct Access gives remote users access to a variety of internal files, applications, and websites, all without the need to connect to a VPN. Essentially, the goal is to allow the remote user to connect directly to the intranet (the company's private network) without difficult tweaking of settings. When enabled, Microsoft Direct Access automatically creates two-way communication between the computer and the internal network each time that computer connects to the internet.

This means that the end user doesn’t have to think about logging on. The IT administrators can also manage the computers that are set up with Microsoft Direct Access remotely, even if the computers are not connected directly to the VPN.

Tasks Available from the Direct Access Management Console

The Direct Access management console gives IT and network professionals all of the tools they need to set up the proper infrastructure for their companies’ unique needs. With it, they can:

  - Identify which servers housing applications should prompt users for authentication, which is optional;
  - Identify the servers that are part of the infrastructure, including their location on the network and DNS;
  - Configure DNS names that the internal servers must resolve;
  - Configure the location of internal websites so that computers with Direct Access enabled can determine when they are on the network;
  - Configure all of the individual network adapters on the server as well as things like certificates necessary for authentication; and
  - Specify which computers can use Direct Access by setting up and then assigning unique security groups.

IT professionals and companies can also monitor Direct Access so they can see the entire infrastructure running on a particular server.
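
If you prefer PowerShell to the console, Server 2012 and later expose much of the same information through the RemoteAccess module. A rough sketch; cmdlet availability depends on the role being installed:

Import-Module RemoteAccess

Get-RemoteAccess          # overall DirectAccess / VPN configuration for the server
Get-RemoteAccessHealth    # per-component health of the deployment
Get-DAClient              # security groups currently allowed to use DirectAccess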

The Primary Benefits

Aside from providing a host of benefits to IT professionals and others who are responsible for setting up and maintaining a local, private network, Microsoft Direct Access also provides several unique benefits to each end user. Most similar remote access programs require each end user to start and terminate a VPN connection. That means configuring individual computers, and security often depends on end users remembering to terminate the connection at the end of the session. The best part is that while Direct Access is based on Microsoft technology, it can also be implemented on UNIX and Linux servers.

Whereas traditional VPNs come with several downfalls, including the need to deploy and maintain software and to attempt to configure finicky firewalls, Microsoft Direct Access alleviates some of those problems. It essentially takes the place of the VPN, plays nicely with firewalls, and automatically connects end users to a company’s intranet – all with no additional action necessary on the end user’s part.  

Differences Between Backups and Business Continuity

Chances are good that backups are a very important part of your company’s IT strategy. They’re vital for many reasons, including the prevention of valuable data loss. Although backups can be (and often are) a valuable part of business continuity, they are not one and the same. Understanding the difference between backups and business continuity can help you make better decisions about your company’s IT needs.

The Benefits and Limitations of Data Backups

A data backup is redundancy in its simplest form. Imagine that you run a small business, and you store shipping and billing data for about 5000 customers. A data backup is just that – another source where data is stored, either physically or on the cloud, so that you can retrieve this vital information in the event that your hard drive or server crashes. This is certainly a wonderful thing. You won’t lose your customers’ information, and you can continue running your business – to a degree.

Now, imagine that you have a server hosting your company’s website, which provides your customers with access to the service they pay for. You can certainly back up your server data, but if that server crashes, what happens to your website? It gives users an error, those users can’t access it, and business can’t continue. How many customers will want refunds, or at least partial refunds? How much money will an outage – even if it’s just for one day – cost? That’s where business continuity strategy comes in.

The Need for Business Continuity

Business continuity ensures that if something happens to your main server, your business will still continue as usual with minimal downtime, if any at all. These days, people have come to expect continuous uptime, and if you can’t provide it, then your clients and customers will likely find a new service provider in relatively short order. That’s why it’s important to put a continuity strategy, complete with data backup, into place.

Business continuity requires planning and constant testing of metrics. You'll need to devise and develop things like recovery point and recovery time objectives (RPO and RTO, respectively) to understand how much data is truly at risk and how quickly you can recover it. For example, an RPO of four hours means you must be able to restore data to a point no more than four hours old, so backups have to run at least that often; an RTO of one hour means systems must be back in service within an hour of a failure. When you understand these things, being able to provide continuous service, even in the face of a disaster or complete hardware failure, becomes possible.

Putting the Two Together

With all of this in mind, data backup is and always should be a very, very important part of your business continuity strategy. After all, if you lose your company data, you can’t operate. Just keep in mind that actual business continuity is about far more than just scheduling backups. You can consider a variety of continuity options ranging from cloud storage to colocation, and each comes with its own unique set of benefits. You’ll find options that are affordable, flexible, scalable, and above all else, reliable.

Backups and business continuity are both very important to your company’s overall success, but they are not the same things. Business continuity requires careful planning on a variety of fronts, including things like disaster recovery. When you have all of your bases covered, providing uninterrupted service to clients and customers is far less daunting. 

Why are so Many Businesses Still Scared of the Cloud?

Just because everyone is moving, should you?


Cloud computing involves using a network of servers that are hosted online to store, process and manage data instead of using a computer’s hard drive or a local server. While this practice has become immensely popular over the past few years, many businesses are still afraid of changing over to using this system to store their confidential data for various reasons. 

Security Concerns
Many businesses want to know if it will be possible for other parties to access their data if it has been stored on the cloud. However, in most cases, cloud hosts will be using the most up to date firewalls, security patches and load-balancing systems possible to ensure that everyone’s data remains as safe and confidential as possible. Moving information and work processes to cloud-based solutions will enable on-site IT departments to focus more on overseeing on-site security protocols.


They’re Afraid of Losing Control
When a company's computer network goes down for any reason, the IT department usually bears the brunt of everyone's frustrations. As a result, these professionals end up feeling somewhat apprehensive when it comes to handing over responsibility for 'their networks' to a company or third party that they don't really know. They may wonder who decides what hardware and software to purchase, or what action will be taken in the event of a network issue or hard drive failure.
 

Concerns of Being Tied to One Provider
Another concern voiced by many businesses regarding cloud computing is that they will be tied to one particular service provider, either because they have been required to sign a contract or because their software has been developed around a particular form of architecture. They want to know that they will have the option of changing providers in the event that they experience bad service in any way. However, this should no longer be a concern for businesses, as most cloud providers are normally willing to provide short-term or even pay-as-you-go options, which give users the flexibility they require.

At VEI we think there is a right solution for everyone.  Sometimes the cloud works for your data and business model and sometimes it does not.  A dedicated server farm, virtual teams, experts on staff or service contracts with vendors are all good solutions.  The cloud is maturing and many of these concerns will be addressed in the next couple of years.  If you want to know more, talk to us.  We have experience with moving organizations to Google Apps and Office 365.  We are also one of the few companies that performs back migrations: moving from the cloud back to on-premises systems.

How Businesses Lose Productivity by Not Using Collaboration Tools like Shared Documents


Many businesses seem reluctant to implement collaboration tools because they feel it will present too much of a learning curve and waste time. However, this could not be further from the truth, because these tools have in fact been developed to help businesses and employees work smarter and get more done each day. There are many ways in which the refusal to use collaboration tools can affect productivity levels.

Projects Take Longer to Complete
When collaboration tools like shared documents are not being used, it means that employees have to partake in face to face meetings, deal with endless email chains or endure long telephonic conversations in order to communicate with each other. This is not only frustrating for employees; it can result in projects taking far longer to complete than necessary as well. However, when using an online shared document system, everyone who needs to have access to specific information can access it at the same time.

See our blog post on configuring Google Apps automation.
 

Accountability Levels are compromised
In cases where a single piece of information has to be passed from one employee to another instead of being made available in a central location, it increases the chances of employees blaming logistical issues when their productivity levels have not been up to standard. For example, if they have not done their share of work for a project and this is queried, some of the most commonly used excuses for this are usually, “The email hasn’t arrived yet,” or “I’m still waiting for them to call me.” Using cloud-based shared document systems can prevent all of this from happening.

Quality of Work is affected
Employees who have to spend hours requesting and searching for information can become frustrated and disheartened, which will in turn affect the quality of the work they turn out. When everything they need to complete a project is available in a shared, central location, they can get the job done more quickly and easily than before. That same frustration can also result in work being completed late and/or being of poor quality, which will have a negative effect when the time comes for clients to decide whether they want to continue working with a particular company or not.

Money is Lost
In addition to overall productivity levels being affected, businesses that have not embraced collaboration tools and software could find themselves losing a lot of money. This is especially true in cases where they may be making use of freelancers, who normally charge a fixed hourly rate for their services (or part thereof). Storing the information needed to complete projects in a single location that is easily accessible may cost a little money each month, but the long-term savings will far outweigh this cost.
If you are a business owner who has become frustrated because projects are taking too long to complete as a result of information not being readily available to everyone who needs it, contact us to find out how you will be able to benefit from our collaboration solutions.

 

 

February 2016 Microsoft Patch and Update errors

While patching servers for one of our managed services customers, Nick noted an error.  The errors were generated on a Windows 2012 R2 domain controller that is normally a rock with the cleanest event logs ever seen.

Event ID 30 was logged during system reboot.  While concerning, the error cleared during the next reboot.  The patch may have introduced a race condition which resolved at the next startup and shutdown cycle.
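
If you want to confirm the error has not recurred since the patch window, the log is easy to query. This assumes the event landed in the System log; adjust the log name and ID to match what you saw:

Get-WinEvent -FilterHashtable @{ LogName = 'System'; Id = 30; StartTime = (Get-Date).AddDays(-7) } |
    Select-Object TimeCreated, ProviderName, Message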

February 2016 had some hiccups.  There are additional issues noted in this article:

https://redmondmag.com/articles/2016/02/16/security-patch-woes.aspx

Configuring GAM with Google Apps


While GAM is a great tool, it can be a bit challenging to set up, especially since Google made some look-and-feel changes to their developer website.  This post will briefly walk you through setting up GAM with Google Apps.

You need to register the application with Google and activate the right API before GAM will function.  The details of this are here on the GAM support wiki.  However, there are a few twists and turns.


Two Factor Authentication with ADFS

Multi-factor authentication is a way of ensuring that your users are who they say they are.  It can be highly effective at mitigating phishing attacks, password guessing and weak password policies.  In Windows 2012 R2 the capabilities for two-factor authentication are available out of the box.

Note:  Users don't always love two-factor authentication because of the extra steps involved.  However, individual users can be selected for MFA, or you can require it only for specific services, so not every login involves waiting for a notification.

If you are using devices for the second factor, and that is the normal approach, you need to prepare the Active Directory implementation first.  Run this command to update the forest to support device authentication.

Initialize-ADDeviceRegistration
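
On 2012 R2 the cmdlet typically wants the ADFS service account, and the Device Registration Service then has to be enabled on the ADFS server. A sketch, with a placeholder account name:

# Prepare the forest; the account shown is a placeholder for your ADFS service account
Initialize-ADDeviceRegistration -ServiceAccountName "YOURDOMAIN\svc-adfs"

# Then, on the ADFS server, turn on the Device Registration Service
Enable-AdfsDeviceRegistration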

The ADFS server is the provider of MFA.  Consider how people will access the system internally and externally.  The recommended solution is to install ADFS proxy servers (Web Application Proxy) in the farm.  ADFS will function with other firewalls and load balancers as well.  Do not install an ADFS server outside your network.

 

Note: External users need externally resolvable names, so DNS and port planning is important.  You will also need a certificate trusted by a third-party authority for configuration. See our blog post about correctly configuring a certificate.

Requesting a certificate for ADFS

Differences in version 3.0 SSL certificate request

Configure ADFS to allow Multi-Factor Authentication after you have configured the basic server farm.

Select Authentication Policies and configure the options.  The Global Settings under Multi-factor Authentication are where the changes are made.
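
The same change can be scripted. A sketch, assuming you are using the built-in certificate provider and only want to require the second factor for requests coming from outside the network; the claim rule below is the commonly used extranet rule, so adjust it to taste:

# Register an additional authentication provider globally
Set-AdfsGlobalAuthenticationPolicy -AdditionalAuthenticationProvider "CertificateAuthentication"

# Trigger MFA only when the request does not come from inside the corporate network
$extranetMfaRule = @'
c:[Type == "http://schemas.microsoft.com/ws/2012/01/insidecorporatenetwork", Value == "false"]
 => issue(Type = "http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationmethod", Value = "http://schemas.microsoft.com/claims/multipleauthn");
'@
Set-AdfsAdditionalAuthenticationRule -AdditionalAuthenticationRules $extranetMfaRule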


If you are using Device Registration (allowing you to take advantage of Workplace Join) you need a custom DNS Alias

Enterpriseregistration --> the Host Name of your ADFS Server
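
If your internal zone is hosted on Windows DNS, the alias can also be added from PowerShell; the zone and host names below are placeholders:

# CNAME: enterpriseregistration.yourdomain.tld -> the ADFS federation service host name
Add-DnsServerResourceRecordCName -ZoneName "yourdomain.tld" -Name "enterpriseregistration" -HostNameAlias "adfs.yourdomain.tld"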


If your users will be using MFA outside the internal network, the DNS entry above and the server host names need to be accessible from the public internet.  Publish the changes to your external servers as well.

Note: make sure that your certificate request takes into account any external names.




Microsoft keeps improving Hyper-V. Is moving away from VMWare becoming more compelling?

Is moving away from VMWare becoming more compelling?  Microsoft continues to make improvements in the Hyper-V offering and infrastructure stack.  Windows 2016 looks like it will bring more capabilities and improved management.  This is great news for organizations that need to divest themselves of expensive annual support costs or are considering going to the cloud for cost reasons but want to keep important data close to home.

  1. Rolling upgrades for Hyper-V and scale-out file server clusters for faster adoption of new operating systems
  2. Functionality for hot add and remove memory and NIC, reducing downtime
  3. Virtual machine compute resiliency, so that virtual machines continue running even if the compute cluster fabric service fails
  4. Nano Server, a deeply refactored version of Windows Server with a small footprint and remotely managed installation, optimized for the cloud and a DevOps workflow

See more at: http://blogs.technet.com/b/windowsserver/archive/2015/05/04/what-s-new-in-windows-server-2016-technical-preview-2.aspx
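
A few of these are already easy to try from PowerShell on a 2016 technical preview host. A quick sketch with placeholder VM, switch and cluster names:

# Hot-add a network adapter to a running VM
Add-VMNetworkAdapter -VMName "app01" -SwitchName "vSwitch-Prod"

# Resize memory on a running VM without a restart
Set-VMMemory -VMName "app01" -StartupBytes 8GB

# After every node in a mixed 2012 R2 / 2016 cluster has been upgraded,
# commit the cluster to the new functional level
Update-ClusterFunctionalLevel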

Adding a Certificate to ADFS 3.0 on Windows 2012 R2

ADFS only allows imports through the GUI in the .PFX format.  To get around this, add your certificate to the Machine’s Personal Store before configuring the first server in the farm.

Hit WINKEY+R to get to the Run line.

Run Certlm.msc

Select the machine's Personal store, right-click and select All Tasks > Advanced Operations > Create Custom Request.  See our blog entry on correctly formatting a CSR.

 

In ADFS 3.0 you can support a Workplace Join after 2012 R2 domain controllers are in place.  If you are going to support this feature, you need to add a Subject Alternate Name (SAN) to your certificate for ADFS:

enterpriseregistration.YOURDOMAIN.TLD

The name has to be “enterpriseregistration” without quotes.  This is a canned name like autodiscover for Exchange.

2012 R2 Certificate request

 

Click Properties

Certificate request properties 2012 R2

An Example of the ADFS 3.0 Certificate request settings:

CSP and Key Length for 2012 R2 certificates
I normally recommend a larger key size than the minimum.  The de facto minimum these days (2015) is 2048 bits, but this is only due to industry players like Microsoft and Google forcing the issue; many certificates still rely on 1024-bit keys.  Given that the industry will probably continue to move to larger keys as a panacea for insecure protocols, a bigger key may save you from having to replace the certificate before its lifetime expires.  What about the CPU overhead of a larger key?  Modern systems with 64-bit architecture and multiple cores will be largely unaffected by the minimal increase in CPU overhead.  If you are doing this on a machine that would be affected, Windows 2012 R2 probably should not be installed on it anyway.

After you receive the certificate, install it on the server running ADFS 3.0.

Installing the certificate on ADFS 3.0 on Windows 2012 R2
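
If your CA returned the issued certificate as a .cer file, a minimal way to complete the pending request in the machine store is certreq from an elevated prompt; the file name is a placeholder:

# Bind the issued certificate to the private key of the pending request
certreq -accept C:\certs\fs_yourdomain_tld.cer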

If you will be creating a second machine in the farm, you can export the certificate via a .PFX file.  Make sure to export the private key.  The export procedure is the same for ADFS 2.0.  
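
A sketch of that export using the PKI module that ships with 2012 R2; the subject filter, path and password are placeholders:

# Locate the ADFS certificate in the machine store and export it, private key included
$pfxPassword = Read-Host -Prompt "PFX password" -AsSecureString
$adfsCert    = Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.Subject -like "CN=fs.yourdomain.tld*" }
Export-PfxCertificate -Cert $adfsCert -FilePath "C:\certs\adfs-farm.pfx" -Password $pfxPassword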





The value of Proactive Monitoring

Focused System Monitoring

We monitor the performance of each server, including memory, CPU utilization and storage.  If there is an outage or decrease in performance (such as a full hard drive or spike in CPU use) we notify you immediately and make our support team aware of the issue.  We also track the performance and provide historical graphs and logs, plus weekly reports.  This affordable service offering gives an organization a good general picture of their server health and historical records can be crucial in budgetary planning and expenditure justification.  System monitoring has optional Comprehensive Incident Response which frees up your IT staff from daily firefighting.

We also provide this service for your mission critical desktop systems at a reduced price.

 

Proactive Application Monitoring

Servers provide critical applications to your end users such as email, databases or directory services.  Application performance is not always directly tied to system hardware performance.  For example, Exchange servers will always have high memory utilization without incurring a performance penalty.  Conversely, they can grind to a halt under a heavy spam load without spiking CPU utilization.  In another example, Active Directory can experience replication errors due to network issues that are outside the server but directly impact your users' ability to log on.  Our intelligent agents monitor the internal performance of the application independent of the server's general performance.  In all three cases, you can see a potential issue before it gets out of control.  Combined with User Experience Modeling, this allows us to provide a constant health check for your critical applications and historical data for planning and incident response.

Proactive Application Monitoring includes Focused System Monitoring. In addition to weekly reports and trending data, Proactive Monitoring gives you access to real-time data in our customer portal and the option of Comprehensive Incident Response.

Comparing the MyISAM and InnoDB database engines for mySQL

Databases are almost always used when building applications, whether they are web applications or native applications. Choosing an appropriate database engine is a critical step in the design and planning stage and should not be overlooked. A database engine (sometimes called a storage engine) is the underlying software in a database management system that takes care of creating, reading, updating, and deleting data. This article will be comparing two engines that are commonly used with MySQL, MyISAM and InnoDB. For those unfamiliar with MySQL, it is an open source relational database management system (RDBMS) developed by Oracle. As of June 2013 it is the most widely used open source RDBMS.

Let's look at MyISAM first and contrast it with InnoDB, since MyISAM is the default engine in MySQL 5.0 and offers several benefits. When set up correctly, and when conditions are ideal, MyISAM is extremely fast. It also offers full-text indexes, which are great for applications that need quick, accurate search functionality. MyISAM tables are also very simple, making them easy to learn and understand. You may be thinking to yourself, "Why would I want to use anything other than MyISAM? This sounds like the perfect engine!" However, its speed and simplicity come with a few major drawbacks...



How do you ensure information security in an uncontrollable environment?

How do you ensure information security in an uncontrollable environment? Years ago, information security was equated with information control and control of access was the principal goal of most IT organizations.  From my perspective, while there was a heavy emphasis on Access control and Auditing, Authentication was thought of as usernames and passwords only.  Today, most institutions don’t have control of their user hardware environments and with cloud services and outsourcing, are losing control of their server environments. But is control really security?  If it is, do you have security if you lose control? While every institution must follow any legal or professional information security mandates and best practices (patching\AV\physical restrictions etc), rearranging your security posture can change the equation and make security more manageable.  Consider the following suggestions to reduce the noise and change your IT attack profile.

Make passwords long, unique and complex but stop making people change them constantly.

People hate being forced to change passwords.  Passwords are a fact of life and they will be a cornerstone of everyday IT management for decades to come. Long and complex passwords reduce the chance that users will have the same one for work that they do for their personal life.  In reality, most people use the same password for everything no matter what security people tell them.  A long, hard-to-crack password that people actually know is arguably more secure than PASSWORD123.  A good solution to this conundrum is a passphrase instead of a password.  Better still is two-factor authentication.
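
In an Active Directory domain, that trade-off can be expressed directly in the default domain password policy. A sketch; the domain name and values are placeholders, and this assumes the ActiveDirectory module is available:

Import-Module ActiveDirectory

# Longer, complex passwords, but only one forced change per year
Set-ADDefaultDomainPasswordPolicy -Identity "yourdomain.tld" `
    -MinPasswordLength 15 `
    -ComplexityEnabled $true `
    -MaxPasswordAge (New-TimeSpan -Days 365)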

Monitor your systems

Utilize an external monitoring system to keep tabs on system performance and status.  This allows you to respond more quickly if you do have an incident and, critically, gives you an offsite set of data\logs that can be used in the aftermath or for auditing purposes.  A system monitoring solution shifts the burden of watching systems off of IT staff so they can focus on other projects (like the ones in this post) and allows you to be proactive instead of reactive about system health.

Configure a guest network for Wireless Access

Visitors, guests and temporary workers should not have access to your network.  Period. Allowing guests on your wireless bypasses all your authentication and access planning.  Set up a guest network that has only internet access; make sure it requires a password and is encrypted, and kick users off after some predetermined amount of time.  If they need more access than that, they should go through some sort of onboarding and institutional vetting.  Solutions like SharePoint file sharing and OneDrive can allow collaboration without direct file server access.

Segregate your high priority data

This is a difficult step but is very helpful.  Would you leave all your valuables scattered around your house during a large gathering or would you segregate them, putting them in a safe or special place?  Treat information the same way.  All the files that a temp is using for a marketing mailing are probably not as important as the bank account numbers stored by the CFO.  Consider physically keeping critical data on a dedicated infrastructure (servers\network) that can be more carefully monitored and maintained.

 

BYOD and Universities

Choosing the Right MDM for Your School: Bring Your Own Device and Mobile Device Management are transforming IT.

BYOD initiatives have changed the landscape of IT in schools, and a functional MDM is crucial. However, with the ever-changing MDM landscape, and because many schools already have limited resources when it comes to IT, keeping up to date with optimal management techniques can be daunting and hard to scale. According to a May 2013 Aberdeen Group survey of 320 IT organizations, 75% had a BYOD program in place, but half of those were taking an "anything goes" approach to managing the mobile ecosystem. With that in mind, choosing the right set of tools to facilitate MDM in schools, while software vendors continue to add new features every few months, remains one of the primary challenges facing network administrators.  Recent improvements to System Center Configuration Manager (SCCM) have made things even easier for Microsoft shops, while Apple has improved its management software.

An article in Computerworld confirmed that when it comes to MDM, 2014 was the battle of the big vendors: "It is the year they will make a run at enterprises that want stability and scale." The article went on to predict that MDM would morph from a peripheral issue to a core IT concern as the year went on. Now that we are into 2015, the latter is certainly true, especially for schools.

 

There is an entire set of policies that have been developed depending on the institution. For example, businesses can configure and manage devices in the same way that COPE (corporate owned, personally enabled) phones have been containerized. For schools the process is a little bit different, but the idea is the same. Integration across tools, including a unified management layer, should be a primary factor. Integrating five or six products is hardly sustainable, and by and large a single solution is better for security purposes.

 

When choosing an MDM, it's best to look for the top suite rather than the best of breed. Consider the way features are delivered, and be mindful that the level of integration within a suite can vary. Vendors may have developed most capabilities natively, but many have acquired features through acquisition or added them through partnerships.

 

In an effort to make scalability more functional, Apple released improvements to its DEP (Device Enrollment Program) in February 2014. Prior to the update, administrators rolling out large iPad installs reported that Windows had better remote installation and configuration support. The release was said to address that issue, giving both enterprise and education programs support for hands-free MDM configuration and eliminating the need to cable up every deployed device and install a profile via Apple's Configurator utility.

 

Regarding schools specifically, Apple also had trouble with students deleting enrollment profiles from their devices in order to access more of the web, including unapproved apps. Along with the updates that should prevent this from happening in the future, Apple has opened up the ability for students under the age of 13 to get an Apple ID. Once a school has enrolled in the DEP, it can request IDs from Apple, who will then send a communication to the parent, who is guided through the registration process. The school is then notified once consent has been given. These types of changes have the ability to make deployments scale up to massive numbers, especially within educational institutions and enterprises.

 

While there has been word that Apple has struggled with the functional scalability of their program, the other option is the Windows SCCM (System Center Configuration Manager) plug-in. Long before Apple’s  DEP, Microsoft had developed the SCCM plug-in which allows end users to search applications via a self-service Software Center. IT administrators are also given the control to define when upgrades and installations take place in addition to installing different applications on different devices.  The services enable secure and scalable software deployment, compliance settings management, and comprehensive asset management of servers, desktops, laptops, and mobile devices spanning across Windows PCs, Macs and Unix/Linux Servers on premises along with cloud-based mobile devices running Windows, Windows Phone, Apple iOS, and Android.

BYOD has put an end to "the user" as the driver, so before deploying an MDM, a full consideration of which suite best meets the needs of your organization is necessary. MDM is finally maturing to a point where many of the kinks are being ironed out, but with the rate at which technology is moving forward, agility should continue to be a primary concern for schools and enterprises alike.


 

How to Configure SSL Certificates for ADFS 2.0

The single most important step when correctly configuring ADFS (Active Directory Federation Services) is the SSL certificate.  This is true if you are using it for Office 365 or for any other purpose.  You should be installing ADFS on a Windows 2008 R2 server and it should be fully patched.  From the server that will be the primary ADFS server in the ADFS server farm you need to create the CSR.  You do not use the IIS certificate manager.  The certificate can be generated via certutil.exe or the Exchange cmdlets, but the GUI (Graphical User Interface) is the simplest approach for many people.  Don't use a self-signed certificate or you will be cleaning up a mess when you finally move things into production.

VE Industries specializes in single sign on, ADFS, Azure, Office 365 and Active Directory.  We can help you with your ADFS implementation.  Contact us and we are happy to assist you.

Creating the CSR

To generate the certificate CSR (Certificate Signing Request) for ADFS (Active Directory Federation Services) you have to use the Certificates MMC (Microsoft Management Console) snap-in; make sure it is pointed at the local computer's store rather than your user store.  This will open the certificate repository.  Right-click on the Personal store and select All Tasks, Advanced Operations, Create Custom Request.  This will start the wizard.  Click Next and then overcome the first challenge.  In the Certificate Enrollment Policy screen, click and highlight Proceed without enrollment policy.

Change the Template Option to Legacy Key

The next screen is where the details become important.

Settings for ADFS 2.0 SSL certificates

An ADFS 2.0 SSL certificate has a couple of critical settings.

  1. The URL of the ADFS server must be set as the Subject Name of the certificate and should be set as a common name, or CN.  That means the veindustries.com implementation would be fs.veindustries.com and the format of the subject name is CN=fs.veindustries.com.  You can utilize a SAN certificate (Subject Alternative Name certificate) if you like, to cover the other server names, but the Subject Name on the certificate will become the service name in ADFS so don't mess it up.
  2. The Key Length must be 2048 or higher.
  3. The Private Key must be exportable.
  4. Don't set the Subject Name to be the same as the server's name.

Configure the certificate via the Properties before clicking Next. Add the subject name and any other server names using the Directory Name type.  I usually set the Friendly Name as the DNS name of the cert so it can be tracked easily later.  Set Server Authentication and Client Authentication in Enhanced Key Usage.  Update the private key and the key length as well.
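
For repeatable requests, the same settings can be captured in an INF file and fed to certreq.exe instead of clicking through the wizard. A sketch; the subject and paths are examples:

# Write the request definition (same settings as above: 2048-bit key, exportable,
# machine store, Server and Client Authentication EKUs)
@"
[NewRequest]
Subject = "CN=fs.veindustries.com"
FriendlyName = "fs.veindustries.com"
KeyLength = 2048
Exportable = TRUE
MachineKeySet = TRUE
RequestType = PKCS10

[EnhancedKeyUsageExtension]
OID = 1.3.6.1.5.5.7.3.1   ; Server Authentication
OID = 1.3.6.1.5.5.7.3.2   ; Client Authentication
"@ | Set-Content C:\certs\adfs.inf

# Generate the CSR to submit to your CA
certreq -new C:\certs\adfs.inf C:\certs\adfs.csr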

Installing the Cert

After you click OK, you can move on to saving the request out as a file.  Upload the CSR to your favorite CA.  When you install the cert you can continue with the ADFS configuration.  Based on a quirk with permissions on private keys and how Microsoft handles certificate requests and storage, you may receive an error such as Event ID 133.  See http://technet.microsoft.com/en-us/library/adfs2-troubleshooting-federation-service-startup-and-shutdown-problems%28v=ws.10%29.aspx .  The ADFS service account needs permission to read the private key, and the private key needs to be in the same store as the certificate.  Let us help you!

Getting started with IaaS on Microsoft Azure

Hosting replica domain controllers in the Azure cloud is one of the most compelling reasons to extend your on-premises Active Directory.  A replica DC is nothing more than another domain controller that is located on the distributed Azure network.  Just like a local environment, it requires a dedicated VM and reliable network connectivity to the other domain controllers in the domain and forest.  All the configuration here was done on Windows 2008 R2. The secret sauce that allows your local network to connect to the Azure network is the point-to-site or site-to-site VPN. This post will focus on the point-to-site VPN since it can be used regardless of the type of firewall or VPN device on your local network.  Microsoft is currently pretty limited with its site-to-site offering.  This link provides a supported list: http://msdn.microsoft.com/en-us/library/windowsazure/jj156075.aspx

Configuring a point to site VPN

A point-to-site VPN connects a single machine in your network, like a domain controller, to the entire virtual network configured in Azure.  It does this by utilizing a certificate-based VPN that has matching certs installed on the target machine and uploaded to Azure.  This connects your local DC to the cloud DC.  Of course, you still need to do the AD basics of configuring sites, assigning subnets and verifying replication.  The certificate can be self-signed but needs a root certificate and its private key.  To make the connection you need to:

  1. Create the root cert
  2. Create the client certificate
  3. Install the client cert on the target machine
  4. Upload the root certificate to Azure
  5. Download the precompiled VPN client

To create the certificate you need the utility makecert.exe from the Visual Studio SDK.  When you have makecert installed, use it to create a root certificate and a client certificate with these commands:

makecert -sky exchange -r -n "CN=<RootCertificateName>" -pe -a sha1 -len 2048 -ss My

makecert.exe -n "CN=<CertificateName>" -pe -sky exchange -m 96 -ss My -in "<RootCertificateName>" -is my -a sha1

If you want to connect additional machines over point-to-site VPN, export the client certificate with its private key as a .pfx file so it can be installed on them; otherwise you can skip that step.  Either way, export the root certificate as a .cer file.  That .cer file needs to be uploaded to Windows Azure to create the VPN connection binary.
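
On Windows 8 / 2012 or later the exports can be scripted with the PKI module (on 2008 R2 use the Certificates MMC instead). The subject names, paths and password below are placeholders matching the makecert examples above:

# makecert -ss My put both certs in the current user's Personal store
$rootCert   = Get-ChildItem Cert:\CurrentUser\My | Where-Object { $_.Subject -eq "CN=AzureP2SRoot" }
$clientCert = Get-ChildItem Cert:\CurrentUser\My | Where-Object { $_.Subject -eq "CN=AzureP2SClient" }

# Root certificate, public key only - this is the .cer you upload to the Azure virtual network
Export-Certificate -Cert $rootCert -FilePath C:\certs\AzureP2SRoot.cer

# Client certificate with its private key, for installing on additional point machines
$pfxPassword = Read-Host -Prompt "PFX password" -AsSecureString
Export-PfxCertificate -Cert $clientCert -FilePath C:\certs\AzureP2SClient.pfx -Password $pfxPassword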

After uploading the certificate, Azure will churn for a while and then produce a ready-to-install network object that is preconfigured for your virtual network's gateway and the root certificate you uploaded.  It actually works extremely well.  The next step is to install the package, go to your network adapters, right-click the new connection and select Connect.  You will be prompted for elevated privileges so that CMROUTE.DLL can update the internal routes on the server.

You can verify the new routes with the old standby command "route print".

Once it connects you are all set!  You can see the data being transferred between the networks in the Azure dashboard and virtual machines running on Azure will be able to communicate with the point server.  Make sure to check those local firewalls if you are troubleshooting!

The Future of BlackBerry Takes a Turn with the Launch of BBM Channels

When BlackBerry first hit the consumer electronics scene in 1999, it was a game-changer. The device allowed people to stay connected, while mobile, to their businesses. BlackBerry’s strongest feature was its messaging and e-mail capabilities. The company continued to focus on these capabilities in its expansion, capitalizing on business oriented communications. While BlackBerry dominated the market for a while, its continued focus on its emailing and messaging prevented device developers from looking at other possibilities. In a sense, BlackBerry neglected the idea that consumers might have a need for alternative applications that phones could not yet provide. As BlackBerry’s market share has continued to fall dramatically there has been a lot of talk about its sale. But before decisions are finalized, both consumers and experts alike are asking the question: is there a compelling reason for a business to use BlackBerry? Would businesses be better off centralizing on a different product and operating system, such as a Windows Phone, or Apple’s iPhone and iOS operating system? Or is bring your own device (BYOD) the way to go?

Centralizing on another product, such as the Windows Phone or the iPhone, allows for companies to set clear expectations of what is acceptable to be done with the device. By centralizing to one device, employees will all be on the same operating system, and in addition, there is less room for security risks. The general consensus is that companies should look into which device is best suited for the business, taking into account privacy, security and specific software applications.

Of course, BYOD would allow employees to choose their own device, which would be ideal for individuals with a preferred product. Letting employees use the devices they are most comfortable with can greatly boost productivity and worker morale. "Mac people" feel most comfortable operating with an iPhone rather than a BlackBerry or another device, and vice versa.

However, BYOD has its own share of problems when it comes to business related communications. One of the most critical aspects to a BYOD program is the security of the data on these personal devices. Many have expressed concerns about accessing sensitive corporate information available on personal devices. There is also the risk of malware infected devices connecting to the corporate network. Allowing employees to use their own devices can also be a distraction, as some may be inclined to use devices for non-work activities during work hours.

While BlackBerry has hit hard times, the once top-tier mobile innovator is not done yet. Early last month, BlackBerry announced that it would launch a cross-platform offering, BBM Channels. The cloud-based enterprise mobility management solution is designed with the tools to secure and manage personal and corporate devices. This new EMM solution will offer business mobile device and application management, as well as security standards and self-service capabilities for end users. The success of this new EMM could help alleviate some of the concerns with BYOD policies, as well as help BlackBerry get back on the path to success.

After the recent launch of the BBM Channels “Messenger App,” BlackBerry has seen more than 10 million users download the free App for both Google Android and Apple iOS. In a recent statement, Andrew Bocking, Executive Vice President of BBM at BlackBerry confirmed, "The mobile messaging market is full of opportunity for BBM. We intend to be the leading private social network for everyone who needs the immediate communication and collaboration of instant messaging combined with the privacy, control and reliability delivered through BBM." But can the success of the App guarantee a future for BlackBerry?

Although BBM Channels is now in beta testing, it's unclear when the service will be more widely available and whether or not the profits will be significant. Of that, Bocking told The Morning Edition, "We continue to plan to evolve the service and keep making it more engaging and have more reasons why people will come back to use the service."  More than just a mobile chat messaging company, it's possible that BlackBerry will seek long-term profits in secure corporate and government communications, perhaps even exploring an acquisition of its own.