Hello all, Ned here again with this week’s conversations between AskDS and the rest of the world. Today we talk Security, ADWS, FSMO upgrades, USMT, and why “Web 2.0 Internet” is still a poisonous wasteland of gross.
Let’s do it to it.
- Security “compliance” and recommendations
- FSMO role OS mixture
- Active Directory Web Services warning event 1400
- USMT hardlink migration error “cannot find all distributed stores.: There are no more files”
- Black Hat
I am getting questions from my Security/Compliance/Audit/Management folks about what security settings we should be applying on XP/2003/2008/Vista/7. Are there Microsoft recommendations? Are there templates? Are there explanations of risk versus reward? Could some settings break things if I’m not careful? Can I get documentation in whitepaper and spreadsheet form? Do you also have these for Office 2007 and Internet Explorer? Can I compare to my current settings to find differences?
[This is another of those “10 times a week” questions, like domain upgrade – Ned]
Yes, yes, yes, yes, yes, yes, and yes. Download the Microsoft Security Compliance Manager. This tool has all the previously scattered Microsoft security documentation in one centralized location, and it handles all of those questions. Microsoft provides comparison baselines for “Enterprise Configuration” (less secure, more functional) and “Specialized Security-Limited Functionality” (more secure, less usable) modes, within each Operating System. Those are further distinguished by role and hardware – desktops, laptops, domain controllers, member servers, users, and the domain itself.
Drill down into the tabs of a given setting and you’ll see more details, explanations, and the reasoning for why you might (or might not) want to choose a particular value.
It also includes further documentation and lets you export the settings completely as GPO backups, DCM packs, SCAP, INF templates, or Excel spreadsheets.
It’s slick stuff. I think we got this right and the Internet’s “shotgun documentation” gets this wrong.
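If you export a baseline as an INF security template, you can compare it against a machine’s current settings using the built-in secedit tool. A rough sketch (the baseline and log file names here are made up for illustration):

```shell
:: Analyze the local machine's current security settings against an
:: exported baseline template; differences are written to the log file.
secedit /analyze /db C:\temp\baseline.sdb /cfg C:\temp\Win7-EC-Desktop.inf /log C:\temp\analysis.log

:: Pull out just the settings that don't match the baseline:
findstr /i "mismatch" C:\temp\analysis.log
```

This is handy for scripted drift checks between SCM's recommendations and what a box is actually running.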
Is it ok to have FSMO roles running on a mixture of operating systems? For example, a PDC Emulator on Windows Server 2003 and a Schema Master on Windows Server 2008?
Yes, it’s generally ok. The main issue people run into is that certain components use the PDCE to create special groups, and if the PDCE is not running at that component’s OS level, the groups will not be created.
For example, these groups will not get created until the PDCE role moves to a Win2008 or later DC:
- SID: S-1-5-21-&lt;domain&gt;-498
Name: Enterprise Read-only Domain Controllers
Description: A Universal group. Members of this group are Read-Only Domain Controllers in the enterprise.
- SID: S-1-5-21-&lt;domain&gt;-521
Name: Read-only Domain Controllers
Description: A Global group. Members of this group are Read-Only Domain Controllers in the domain.
- SID: S-1-5-32-569
Name: BUILTIN\Cryptographic Operators
Description: A Builtin Local group. Members are authorized to perform cryptographic operations.
- SID: S-1-5-21-&lt;domain&gt;-571
Name: Allowed RODC Password Replication Group
Description: A Domain Local group. Members in this group can have their passwords replicated to all read-only domain controllers in the domain.
- SID: S-1-5-21-&lt;domain&gt;-572
Name: Denied RODC Password Replication Group
Description: A Domain Local group. Members in this group cannot have their passwords replicated to any read-only domain controllers in the domain.
- SID: S-1-5-32-573
Name: BUILTIN\Event Log Readers
Description: A Builtin Local group. Members of this group can read event logs from the local machine.
- SID: S-1-5-32-574
Name: BUILTIN\Certificate Service DCOM Access
Description: A Builtin Local group. Members of this group are allowed to connect to Certification Authorities in the enterprise.
If those groups don’t exist, various Win2008/Vista/R2/7 components cannot be configured. From the most boring KB I ever had to re-write:
243330 Well-known security identifiers in Windows operating systems – http://support.microsoft.com/default.aspx?scid=kb;EN-US;243330
I hesitate to ask why you wouldn’t want to move these FSMO roles to a newer OS though.
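If you want to check where your FSMO roles currently live and whether the PDCE has already created the groups above, both are a one-liner (a sketch using built-in tools; run from a domain-joined machine):

```shell
:: Show which DCs hold the five FSMO roles:
netdom query fsmo

:: Check whether the RODC-related groups exist yet. These commands fail
:: with "The group name could not be found" if the PDCE role has never
:: run on a Win2008 or later DC:
net group "Read-only Domain Controllers" /domain
net group "Denied RODC Password Replication Group" /domain
```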
Every time I boot my domain controller it logs this warning:
Log Name: Active Directory Web Services
Date: 6/26/2010 10:20:22 PM
Event ID: 1400
Task Category: ADWS Certificate Events
Active Directory Web Services could not find a server certificate with the specified certificate name. A certificate is required to use SSL/TLS connections. To use SSL/TLS connections, verify that a valid server authentication certificate from a trusted Certificate Authority (CA) is installed on the machine.
Certificate name: mydc.contoso.com
It otherwise works fine and I can use ADWS just fine. Do I care about this?
Only if you:
1. Think you have a valid Server Authentication certificate installed, or
2. Want to use SSL to connect to ADWS.
By default, Windows Server 2008 R2 DCs will log this warning until they are issued a valid server certificate (which you get for free once you deploy a Microsoft Enterprise PKI, via auto-enrollment of a Domain Controller certificate). Once that happens you will log a 1401 and never see this warning again.
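To see at a glance whether your DC is still logging the 1400 warning or has moved on to the happy 1401, you can query the ADWS event log from the command line; a quick sketch:

```shell
:: Show the five most recent ADWS certificate events in readable form
:: (1400 = no usable server certificate found, 1401 = certificate found):
wevtutil qe "Active Directory Web Services" /q:"*[System[(EventID=1400 or EventID=1401)]]" /c:5 /rd:true /f:text
```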
If you think you have the right certificate (and in this case, the customer thought he did – it had the Server Authentication EKU (1.3.6.1.5.5.7.3.1), the right SAN, and chained fine), compare it to a valid DC certificate issued by a Microsoft CA. You can do all this in a test lab even if you’re not using our PKI: just create a default PKI “next next next” style and examine an exported DC certificate. When we compared the exported certificates, we found that his 3rd-party-issued cert was missing a Subject entry, unlike mine. We theorized that this was the problem – a Subject is not required for a cert to be valid, but any application can decide it’s important, and it’s likely ADWS does.
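You can do the certificate comparison without ever opening an MMC: certutil will dump the machine store and verify an exported cert, including its EKU, Subject, and chain. A sketch (the exported file name is hypothetical):

```shell
:: List the certificates in the local machine's Personal store, showing
:: subject, issuer, EKU, and validation status for each:
certutil -store MY

:: Verify a specific exported certificate end-to-end, fetching CRL/AIA
:: data from the URLs embedded in the cert:
certutil -verify -urlfetch dccert.cer
```

Comparing the certutil output of the working and non-working certs side by side makes a missing field like the Subject jump right out.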
Seeing this error when doing a USMT 4.0 migration:
[0x080000] HARDLINK: cannot find distributed store for d – cee6e189-2fd2-4210-b89a-810397ab3b7f[gle=0x00000002]
[0x0802e3] SelectTransport: OpenDevice failed with Exception: Win32Exception: HARDLINK: cannot find all distributed stores.: There are no more files. [0x00000012] void __cdecl Mig::CMediaManager::SelectTransportInternal(int,unsigned int,struct Mig::IDeviceInitializationData *,int,int,int,unsigned __int64,class Mig::CDeviceProgressAdapter *)
We have a C: and D: drive and when we run the migration we use these steps:
- Scanstate with hard-link for both drives.
- Delete the D: drive partition and extend out C: to use up that space.
- Run the loadstate.
If we don’t delete the D: partition it works fine. I thought all the data was going into the hard-link store on “C:\store”?
Look closer. 🙂 When you create a hard-link store and specify the store path, each volume gets its own hard-link store. Hard-links cannot cross volumes.
Scanstate /hardlink c:\USMTMIG […]
Running this command on a system with the operating system on the C: drive and user data on the D: drive generates a migration store on each volume – C:\USMTMIG and D:\USMTMIG.
The store on C: is called the “main store” and the one on the other drive is called the “distributed store”. If you want to know more about the physicality and limits of hard-link stores, review: http://technet.microsoft.com/en-us/library/dd560753(WS.10).aspx.
Now, all is not lost – here are some options to get around this:
1. You could not delete the partition (duh).
2. You could move all data from the other partition to your C: drive, and get rid of that partition, before running scanstate.
3. You could run the scanstate as before, then xcopy the D: drive store into the C: drive store, thereby preserving the data. For example:
a. Scanstate with hard-link.
b. xcopy /s /e /h /k d:\store\* c:\store
rd /s /q d:\store <– this step is optional. After all, you are deleting the partition later!
c. Delete the D: partition and extend C: like you were doing before.
d. Run loadstate.
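Option 3 above could be scripted roughly like this (a sketch only – the store path and the XML config files are placeholders for whatever your real scanstate command line uses; hard-link stores require /nocompress):

```shell
:: 1. Capture both volumes into per-volume hard-link stores:
scanstate c:\store /hardlink /nocompress /i:migdocs.xml /i:migapp.xml /o

:: 2. Copy the D: distributed store into the C: main store so the data
::    survives the partition deletion (the hard-links on D: become real
::    file copies once they land on C:):
xcopy /s /e /h /k d:\store\* c:\store

:: 3. (Optional) remove the now-redundant D: store:
rd /s /q d:\store

:: 4. After deleting the D: partition and extending C:, restore:
loadstate c:\store /hardlink /nocompress /i:migdocs.xml /i:migapp.xml
```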
There may be other issues here (after all, some application may have been pointing to files on D: and is now very angry) so make sure your plan takes that into consideration. You may need to pay a visit to <locationModify>.
The Black Hat Vegas USA 2010 folks have published their briefings and this one by Ivan Ristic from Qualys really struck me:
State of SSL on the Internet: 2010 Survey, Results and Conclusions
Some mind-blowingly disappointing (ahem, interesting) nuggets from their survey of 867,361 certificates being used by websites:
- Only 37% of domains responded when SSL was attempted (the rest were all totally unencrypted)
- 30% of SSL certificates failed validation (not trusted, not chained, invalid signature)
- 50% of the certs supported insecure SSL v2 protocol
- 56% of servers supported weak (less than 128-bit) ciphers
Definitely read the whole presentation, it’s worth your time. Any questions, ask Jonathan Stephens.
That’s all folks, have a nice weekend.
– Ned “I’m gonna pay for that one” Pyle