Pro: Windows Server 2008, Server Administrator
Question No: 31 – (Topic 1)
Your network consists of a single Active Directory domain. The network contains 20 file servers that run Windows Server 2008 R2. Each file server contains two volumes. One volume contains the operating system.
The other volume contains all data files.
You need to plan a recovery strategy that meets the following requirements:
->Allows the operating system to be restored
->Allows the data files to be restored
->Ensures business continuity
->Minimizes the amount of time to restore the server
What should you include in your plan?
A. Windows Deployment Services (WDS)
B. Windows Automated Installation Kit (Windows AIK) and folder redirection
C. the Multipath I/O feature and Volume Shadow Copies
D. the Windows Server Backup feature and System Image Recovery
Answer: D Explanation:
MCITP Self-Paced Training Kit Exam 70-646 Windows Server Administration:
Windows Server Backup Windows Server Backup provides a reliable method of backing up and recovering the operating system, certain applications, and files and folders stored on your server. This feature replaces the previous backup feature that was available with earlier versions of Windows.
Windows Server Backup
The Windows Server Backup tool is significantly different from ntbackup.exe, the tool included in Windows Server 2000 and Windows Server 2003. Administrators familiar with the previous tool should study the capabilities and limitations of the new Windows Server Backup utility because many aspects of the tool’s functionality have changed.
Exam Tip: What the tool does
The Windows Server 2008 exams are likely to focus on the differences between NTBACKUP and Windows Server Backup.
The key points to remember about backup in Windows Server 2008 are:
->Windows Server Backup cannot write to tape drives.
->You cannot write to network locations or optical media during a scheduled backup.
->The smallest object that you can back up using Windows Server Backup is a volume.
->Only local NTFS-formatted volumes can be backed up.
->Windows Server Backup writes its output as VHD (Virtual Hard Disk) files. VHD files can be mounted with the appropriate software and read, either directly or through virtual machine software such as Hyper-V.
MORE INFO Recovering NTbackup backups
You cannot use Windows Server Backup to recover backups written using ntbackup.exe. A special read-only version of ntbackup.exe that is compatible with Windows Server 2008 can be downloaded from http://go.microsoft.com/fwlink/?LinkId=82917.
Windows Server Backup is not installed by default on Windows Server 2008 and must be installed as a feature using the Add Features item under the Features node of the Server Manager console. When installed, the Windows Server Backup node becomes available under the Storage node of the Server Manager Console. You can also open the Windows Server Backup console from the Administrative Tools menu. The wbadmin.exe
command-line utility, also installed during this process, is covered in “The wbadmin Command-Line Tool” later in this lesson. To use Windows Server Backup or wbadmin to schedule backups, the computer requires an extra internal or external disk. External disks will need to be either USB 2.0 or IEEE 1394 compatible. When planning the deployment of disks to host scheduled backup data, you should ensure that the volume is capable of holding at least 2.5 times the amount of data that you want to back up. When planning deployment of disks for scheduled backup, you should monitor how well this size works and what sort of data retention it allows in a trial before deciding on a disk size for wider deployment throughout your organization.
When you configure your first scheduled backup, the disk that will host backup data will be hidden from Windows Explorer. If the disk currently hosts volumes and data, these will be removed to store scheduled backup data. Note that this only applies to scheduled backups and not to manual backups. You can use a network location or external disk for a manual backup without worrying that data already stored on the device will be lost. The format and repartition only happens when a device is first used to host scheduled backup data.
It does not happen when subsequent backup data is written to the same location.
It is also important to remember that a volume can only store a maximum of 512 backups. If you need to store a greater number of backups, you will need to write these backups to a different volume. Of course given the amount of data on most servers, you are unlikely to find a disk that has the capacity to store so many backups.
So that scheduled backups can always be executed, Windows Server Backup will automatically remove the oldest backup data on a volume that is the target of scheduled backups. You do not need to manually clean up or remove old backup data.
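Although the text above describes the GUI wizard, the same schedule can be created from the command line with wbadmin. A minimal sketch, with the disk identifier and volume letters as placeholders for your environment:

```powershell
# Sketch only: the disk ID and volumes below are placeholders; run from an elevated prompt.
# List the disks that can serve as a scheduled-backup target and note the Disk Identifier.
wbadmin get disks

# Schedule a daily 9:00 P.M. backup of the C: and D: volumes to that disk.
# Remember: the target disk is repartitioned and hidden the first time it is used.
wbadmin enable backup -addtarget:"{a1b2c3d4-0000-0000-0000-000000000000}" -schedule:21:00 -include:C:,D:
```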
Performing a Scheduled Backup
Scheduled backups allow you to automate the backup process. After you set the schedule, Windows Server Backup takes care of everything else. By default, scheduled backups are set to occur at 9:00 P.M. If your organization still has people regularly working on documents at that time, you should reset this. When planning a backup schedule you should ensure that the backup occurs at a time when the most recent day’s changes to data are always captured. Only members of the local Administrators group can configure and manage scheduled backups.
To configure a scheduled backup, perform the following steps:
Open Windows Server Backup. Click Backup Schedule in the Actions pane of Windows Server Backup. This will start the Backup Schedule Wizard. Click Next.
The next page of the wizard asks whether you want to perform a full server backup or a custom backup. Select Custom and click Next. As you can see in Figure 12-3, volumes that contain operating system components are always included in custom backups. Volume E is excluded in this case, because this is the location where backup data will be written.
Figure 12-3 Selecting backup items
The default backup schedule is once a day at 9:00 P.M. You can configure multiple backups to be taken during the day. You are most likely to do this in the event that data on the server that you are backing up changes rapidly. On servers where data changes a lot less often, such as on a Web server where pages are only updated once a week, you would configure a more infrequent schedule.
On the Select Destination Disk page, shown in Figure 12-4, you select the disk that backups are written to. If multiple disks are selected, multiple copies of the backup data are written. You should note that the entire disk will be used. All existing volumes and data will be removed and the backup utility will format and hide the disks prior to writing the first backup data.
On the Label Destination Disk page, note the label given to the disk you have selected to store backups. When you finish the wizard, the target destination is formatted and then the first backup will occur at the scheduled time.
An important limitation of Windows Server Backup is that you can only schedule one backup job. In other words, you cannot use Windows Server Backup to schedule jobs that
you might be used to scheduling in earlier versions of Windows, such as a full backup on Monday night with a series of incremental backups every other day of the week. You can configure Windows Server Backup to perform incremental backups, but this process is different from what you might be used to with other backup applications.
Figure 12-4 Selecting a destination disk
Performing an Unscheduled Single Backup
Unscheduled single backups, also known as manual backups, can be written to network locations, local and external volumes, and local DVD media. If a backup encompasses more than the space available on a single DVD media, you can span the backup across multiple DVDs. Otherwise, if the calculated size of a backup exceeds the amount of free space available on the destination location, the backup will fail. You will perform a manual backup in a practice exercise at the end of this lesson.
When performing a manual backup, you must choose between the following two types of Volume Shadow Copy Service (VSS) backup:
VSS Copy Backup: Use this backup option when another backup product is also used to back up applications on volumes in the current backup. Application log files are retained when you perform this type of manual backup. This is the default when taking a manual backup.
VSS Full Backup: Use this backup option when no other backup products are used to back up the host computer. This option updates each file's backup attribute and clears application log files.
When performing a single backup, you can also back up a single volume without having to back up the system or boot volumes. This is done by clearing the Enable System Recovery option when selecting backup items.
You might use this option to back up a specific volume’s data when you are going to perform maintenance on the volume or suspect that the disk hosting the volume might fail, but do not want to wait for a full server backup to complete.
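The same choices surface when running a one-off backup with wbadmin. A hedged sketch, with the share path and volume letter purely illustrative:

```powershell
# Manual (unscheduled) backup of a single data volume to a network share.
# -vssCopy preserves application log files (the default for a manual backup);
# -vssFull updates each file's backup attribute and clears application logs.
wbadmin start backup -backupTarget:\\backupsrv\backups -include:D: -vssCopy
```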
Full Server and Operating System Recovery
Also known as Bare Metal Recovery, full server recovery allows you to completely restore the server by booting from the Windows Server 2008 installation media or Windows Recovery Environment. See the note on building a recovery solution for more information on how to set up a local Windows Recovery Environment on a Windows Server 2008 computer. Full server recovery goes further than the Automated System Recovery (ASR) feature that was available in Windows Server 2003 because full server recovery will restore all operating system, application, and other data stored on the server. ASR did not provide such a complete recovery and it was necessary to further restore data from backup after the ASR process was complete.
An operating system recovery is similar to a full server recovery except that you only recover critical volumes and do not recover volumes that do not contain critical data. For example, if you have a file server where the disks that host critical operating system volumes are separate from the disks that host shared folder volumes and the disks that host the critical operating system volumes fail, you should perform an operating system recovery.
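At the command line, an operating-system (critical-volumes-only) recovery is driven from the Windows Recovery Environment. A sketch of the sequence, with the version identifier and target drive as placeholders:

```powershell
# From the Windows Recovery Environment command prompt: list the available
# backup versions on the backup target.
wbadmin get versions -backupTarget:E:

# Recover from a chosen version. Omitting -restoreAllVolumes recovers only
# the critical (operating system) volumes, as described above.
wbadmin start sysrecovery -version:01/01/2011-21:00 -backupTarget:E:
```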
Figure 12-13 Select Windows Complete PC Restore
Question No: 32 HOTSPOT – (Topic 1)
You are designing a monitoring solution to log performance for servers that run Windows Server 2008 R2.
The monitoring solution must allow members of the Performance Log Users group to create and modify Data Collector Sets.
You need to grant members of the Performance Log Users group the necessary permissions.
Which User Rights Assignment policy should you configure?
To answer, select the appropriate User Rights Assignment policy in the answer area.
Log on as a batch job
This policy setting determines which accounts can log on by using a batch-queue tool such as the Task Scheduler service. When an administrator uses the Add Scheduled Task wizard to schedule a task to run under a particular user name and password, that user is automatically assigned the Log on as a batch job user right. When the scheduled time
arrives, the Task Scheduler service logs the user on as a batch job instead of as an interactive user, and the task runs in the user's security context.
Possible values: User-defined list of accounts, Not Defined
The Log on as a batch job user right presents a low-risk vulnerability. For most organizations, the default setting of Not Defined is sufficient. Members of the local Administrators group have this right by default.
You should allow the computer to manage this logon right automatically if you want to allow scheduled tasks to run for specific user accounts. If you do not want to use the Task Scheduler in this manner, configure the Log on as a batch job user right for only the Local Service account.
For IIS servers, you should configure this policy locally instead of through domain-based Group Policy settings so that you can ensure that the local IUSR_<ComputerName> and IWAM_<ComputerName> accounts have this logon right.
If you configure the Log on as a batch job setting by using domain-based Group Policy settings, the computer cannot assign the user right to accounts that are used for scheduled jobs in the Task Scheduler. If you install optional components such as ASP.NET or IIS, you may need to assign this user right to additional accounts that are required by those components. For example, IIS requires assignment of this user right to the IIS_WPG group and the IUSR_<ComputerName>, ASPNET, and IWAM_<ComputerName> accounts. If this user right is not assigned to this group and these accounts, IIS cannot run some COM objects that are necessary for proper functionality.
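To verify which accounts currently hold this right on a given server, one option is to export the local security policy and inspect the SeBatchLogonRight entry. A sketch; the output path is arbitrary:

```powershell
# Export the local security policy to an INF file, then find the line that
# lists the accounts (as names or SIDs) holding the batch-logon privilege.
secedit /export /cfg C:\temp\secpol.inf
Select-String -Path C:\temp\secpol.inf -Pattern 'SeBatchLogonRight'
```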
Question No: 33 – (Topic 1)
Your network consists of a single Active Directory domain. Your network contains 10 servers and 500 client computers. All domain controllers run Windows Server 2008 R2.
A Windows Server 2008 R2 server has Remote Desktop Services installed. All client computers run Windows XP Service Pack 3.
You plan to deploy a new line of business Application. The Application requires desktop themes to be enabled.
You need to recommend a deployment strategy that meets the following requirements:
->Only authorized users must be allowed to access the Application.
->Authorized users must be able to access the Application from any client computer.
->Your strategy must minimize changes to the client computers.
->Your strategy must minimize software costs.
What should you recommend?
A. Migrate all client computers to Windows 7. Deploy the Application to all client computers by using a Group Policy object (GPO).
B. Migrate all client computers to Windows 7. Deploy the Application to the authorized users by using a Group Policy object (GPO).
C. Deploy the Remote Desktop Connection (RDC) 7.0 software to the client computers. Install the Application on the Remote Desktop Services server. Implement Remote Desktop Connection Broker (RD Connection Broker).
D. Deploy the Remote Desktop Connection (RDC) 7.0 software to the client computers. Enable the Desktop Experience feature on the Remote Desktop Services server. Install the Application on the Remote Desktop Services server.
Answer: D Explanation: Desktop Experience
Configuring a Windows Server 2008 server as a terminal server lets you use Remote Desktop Connection 6.0 to connect to a remote computer from your administrator workstation and reproduces on your computer the desktop that exists on the remote computer. When you install Desktop Experience on Windows Server 2008, you can use Windows Vista features such as Windows Media Player, desktop themes, and photo management within the remote connection.
Question No: 34 – (Topic 1)
Your network consists of a single Active Directory domain. All domain controllers run Windows Server 2008 R2. There are five servers that run Windows Server 2003 SP2. The Windows Server 2003 SP2 servers have the Terminal Server component installed. A firewall server runs Microsoft Internet Security and Acceleration (ISA) Server 2006. All client computers run Windows 7.
You plan to give remote users access to the Remote Desktop Services servers.
You need to create a remote access strategy for the Remote Desktop Services servers that meets the following requirements:
->Minimizes the number of open ports on the firewall server
->Encrypts all remote connections to the Remote Desktop Services servers
->Prevents network access to client computers that have Windows Firewall disabled
What should you do?
A. Implement port forwarding on the ISA Server. Implement Network Access Quarantine Control on the ISA Server.
B. Upgrade a Windows Server 2003 SP2 server to Windows Server 2008 R2. On the Windows Server 2008 R2 server, implement the Remote Desktop Gateway (RD Gateway) role service, and implement Network Access Protection (NAP).
C. Upgrade a Windows Server 2003 SP2 server to Windows Server 2008 R2. On the Windows Server 2008 R2 server, implement the Remote Desktop Gateway (RD Gateway) role service, and configure a Remote Desktop connection authorization policy (RD CAP).
D. Upgrade a Windows Server 2003 SP2 server to Windows Server 2008 R2. On the Windows Server 2008 R2 server, implement the Remote Desktop Gateway (RD Gateway) role service, and configure a Remote Desktop resource authorization policy (RD RAP).
Answer: B Explanation:
Terminal Services Gateway
TS Gateway allows Internet clients secure, encrypted access to Terminal Servers behind your organization’s firewall without having to deploy a Virtual Private Network (VPN) solution. This means that you can have users interacting with their corporate desktop or applications from the comfort of their homes without the problems that occur when VPNs are configured to run over multiple Network Address Translation (NAT) gateways and the firewalls of multiple vendors.
TS Gateway works using RDP over Secure Hypertext Transfer Protocol (HTTPS), which is the same protocol used by Microsoft Office Outlook 2007 to access corporate Exchange Server 2007 Client Access Servers over the Internet. TS Gateway Servers can be configured with connection authorization policies and resource authorization policies as a way of differentiating access to Terminal Servers and network resources.
Connection authorization policies allow access based on a set of conditions specified by the administrator; resource authorization policies grant access to specific Terminal Server resources based on user account properties.
Network Access Protection
You deploy Network Access Protection on your network as a method of ensuring that computers accessing important resources meet certain client health benchmarks. These benchmarks include (but are not limited to) having the most recent updates applied, having
antivirus and anti-spyware software up to date, and having important security technologies such as Windows Firewall configured and functional. In this lesson, you will learn how to plan and deploy an appropriate network access protection infrastructure and enforcement method for your organization.
Question No: 35 – (Topic 1)
Your network consists of a single Active Directory domain. The network contains two Windows Server 2008 R2 computers named Server1 and Server2. The company has two identical print devices. You plan to deploy print services.
You need to plan a print services infrastructure to meet the following requirements:
->Manage the print queue from a central location.
->Make the print services available, even if one of the print devices fails.
What should you include in your plan?
A. Install and share a printer on Server1. Enable printer pooling.
B. Install the Remote Desktop Services server role on both servers. Configure Remote Desktop Connection Broker (RD Connection Broker).
C. Install and share a printer on Server1. Install and share a printer on Server2. Use Print Management to install the printers on the client computers.
D. Add Server1 and Server2 to a Network Load Balancing cluster. Install a printer on each node of the cluster.
Answer: A Explanation:
Managing printers can be the bane of a Windows administrator. One feature that may assist you with this task is the Windows printer pooling feature. Windows Server 2008 offers functionality that permits a collection of multiple like-configured printers to distribute the print workload.
Printer pooling makes one share that clients print to, and the jobs are sent to the first available printer. Configuring print pooling is rather straightforward in the Windows printer configuration applet of the Control Panel. Figure A shows two like-modeled printers being pooled.
To use pooling, the printer models need to be the same so that the driver configuration is transparent to the end device; this can also help control costs of toner and other supplies. But plan accordingly – you don’t want users essentially running track to look for their print jobs on every printer in the office.
Question No: 36 – (Topic 1)
Your company has several branch offices.
Your network consists of a single Active Directory domain. Each branch office contains domain controllers and member servers. The domain controllers run Windows Server 2003 SP2. The member servers run Windows Server 2008 R2.
Physical security of the servers at the branch offices is a concern.
You plan to implement Windows BitLocker Drive Encryption (BitLocker) on the member servers.
You need to ensure that you can access the BitLocker volume if the BitLocker keys are
corrupted on the member servers. The recovery information must be stored in a central location.
What should you do?
A. Upgrade all domain controllers to Windows Server 2008 R2. Use Group Policy to configure Public Key Policies.
B. Upgrade all domain controllers to Windows Server 2008 R2. Use Group Policy to enable Trusted Platform Module (TPM) backups to Active Directory.
C. Upgrade the domain controller that has the schema master role to Windows Server 2008 R2. Use Group Policy to enable a Data Recovery Agent (DRA).
D. Upgrade the domain controller that has the primary domain controller (PDC) emulator role to Windows Server 2008 R2. Use Group Policy to enable a Data Recovery Agent (DRA).
Answer: B Explanation:
MCITP Self-Paced Training Kit Exam 70-646 Windows Server Administration: Planning BitLocker Deployment
Windows BitLocker Drive Encryption (BitLocker) is a feature that debuted in Windows Vista Enterprise and Ultimate Editions and is available in all versions of Windows Server 2008. BitLocker serves two purposes:
protecting server data through full volume encryption and providing an integrity-checking mechanism to ensure that the boot environment has not been tampered with.
Encrypting the entire operating system and data volumes means that not only are the operating system and data protected, but so are paging files, applications, and application configuration data. In the event that a server is stolen or a hard disk drive removed from a server by third parties for their own nefarious purposes, BitLocker ensures that these third parties cannot recover any useful data. The drawback is that if the BitLocker keys for a server are lost and the boot environment is compromised, the data stored on that server will be unrecoverable.
To support integrity checking, BitLocker requires a computer to have a chip capable of supporting the Trusted Platform Module (TPM) 1.2 or later standard. A computer must also have a BIOS that supports the TPM standard. When BitLocker is implemented in these conditions and in the event that the condition of a startup component has changed, BitLocker-protected volumes are locked and cannot be unlocked unless the person doing the unlocking has the correct digital keys. Protected startup components include the BIOS, Master Boot Record, Boot Sector, Boot Manager, and Windows Loader.
From a systems administration perspective, it is important to disable BitLocker during
maintenance periods when any of these components are being altered. For example, you must disable BitLocker during a BIOS upgrade. If you do not, the next time the computer starts, BitLocker will lock the volumes and you will need to initiate the recovery process. The recovery process involves entering a 48-character password that is generated and saved to a specified location when running the BitLocker setup wizard. This password should be stored securely because without it the recovery process cannot occur. You can also configure BitLocker to save recovery data directly to Active Directory; this is the recommended management method in enterprise environments.
You can also implement BitLocker without a TPM chip. When implemented in this manner there is no startup integrity check. A key is stored on a removable USB memory device, which must be present and supported by the computer’s BIOS each time the computer starts up. After the computer has successfully started, the removable USB memory device can be removed and should then be stored in a secure location. Configuring a computer running Windows Server 2008 to use a removable USB memory device as a BitLocker startup key is covered in the second practice at the end of this lesson.
BitLocker Group Policies
BitLocker group policies are located under the Computer Configuration\Policies\ Administrative Templates\Windows Components\BitLocker Drive Encryption node of a Windows Server 2008 Group Policy object. In the event that the computers you want to deploy BitLocker on do not have TPM chips, you can use the Control Panel Setup: Enable Advanced Startup Options policy, which is shown in Figure 1-7. When this policy is enabled and configured, you can implement BitLocker without a TPM being present. You can also configure this policy to require that a startup code be entered if a TPM chip is present, providing another layer of security.
Figure 1-7 Allowing BitLocker without the TPM chip
Other BitLocker policies include:
->Turn On BitLocker Backup To Active Directory Domain Services: when this policy is enabled, a computer's recovery key is stored in Active Directory and can be recovered by an authorized administrator.
->Control Panel Setup: Configure Recovery Folder: when enabled, this policy sets the default folder to which computer recovery keys can be stored.
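On an individual server, the recovery information can also be created and pushed to AD DS with manage-bde. A sketch only; the protector GUID below is a placeholder obtained from the -get output:

```powershell
# Turn on BitLocker for C: with a numerical recovery password protector.
manage-bde -on C: -RecoveryPassword

# List the protectors to obtain the ID of the numerical password protector.
manage-bde -protectors -get C:

# Back up that recovery information to Active Directory (requires the
# AD backup Group Policy support described above).
manage-bde -protectors -adbackup C: -id "{a1b2c3d4-0000-0000-0000-000000000000}"
```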
Question No: 37 – (Topic 1)
Your network consists of a single Active Directory domain. The network includes a branch office named Branch1. Branch1 contains 50 member servers that run Windows Server 2008 R2. An organizational unit (OU) named Branch1Servers contains the computer objects for the servers in Branch1. A global group named Branch1admins contains the user accounts for the administrators. Administrators maintain all member servers in Branch1.
You need to recommend a solution that allows the members of Branch1admins group to perform the following tasks on the Branch1 member servers.
->Stop and start services
->Change registry settings
What should you recommend?
A. Add the Branch1admins group to the Power Users local group on each server in Branch1.
B. Add the Branch1admins group to the Administrators local group on each server in Branch1.
C. Assign the Branch1admins group change permissions to the Branch1Servers OU and to all child objects.
D. Assign the Branch1admins group Full Control permissions on the Branch1Servers OU and to all child objects.
Answer: B Explanation:
Local admins have these rights. Power Users do not
By default, members of the power users group have no more user rights or permissions than a standard user account. The Power Users group in previous versions of Windows was designed to give users specific administrator rights and permissions to perform common system tasks. In this version of Windows, standard user accounts inherently have the ability to perform most common configuration tasks, such as changing time zones. For legacy applications that require the same Power User rights and permissions that were present in previous versions of Windows, administrators can apply a security template that enables the Power Users group to assume the same rights and permissions that were present in previous versions of Windows.
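Adding the domain group to each server's local Administrators group can be scripted; a sketch, where CONTOSO is a placeholder for the actual domain name:

```powershell
# Run on each Branch1 member server: adds the Branch1admins domain global
# group to the local Administrators group.
net localgroup Administrators CONTOSO\Branch1admins /add
```

In practice you would typically enforce this membership centrally via a Group Policy Restricted Groups setting linked to the Branch1Servers OU rather than running the command per server.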
Question No: 38 – (Topic 1)
Your network contains a Windows Server 2008 R2 server that functions as a file server. All users have laptop computers that run Windows 7.
The network is not connected to the Internet. Users save files to a shared folder on the server.
You need to design a data provisioning solution that meets the following requirements:
->Users who are not connected to the corporate network must be able to access the files and the folders in the corporate network.
->Unauthorized users must not have access to the cached files and folders.
What should you do?
A. Implement a certification authority (CA). Configure IPsec domain isolation.
B. Implement a certification authority (CA). Configure Encrypting File System (EFS) for the drive that hosts the files.
C. Implement Microsoft SharePoint Foundation 2010. Enable Secure Socket Layer (SSL) encryption.
D. Configure caching on the shared folder. Configure offline files to use encryption.
Answer: D Explanation:
MCITP Self-Paced Training Kit Exam 70-646 Windows Server Administration: Lesson 2: Provisioning Data
Lesson 1 in this chapter introduced the Share And Storage Management tool, which gives you access to the Provision Storage Wizard and the Provision A Shared Folder Wizard.
These tools allow you to configure storage on the volumes accessed by your server and to set up shares. When you add the Distributed File System (DFS) role service to the File Services server role you can create a DFS Namespace and go on to configure DFSR. Provisioning data ensures that user files are available and remain available even if a server fails or a WAN link goes down. Provisioning data also ensures that users can work on important files when they are not connected to the corporate network.
In a well-designed data provisioning scheme, users should not need to know the network path to their files, or from which server they are downloading them. Even large files should typically download quickly; files should not be downloaded or saved across a WAN link when they are available from a local server. You need to configure indexing so that users can find information quickly and easily. Offline files need to be synchronized quickly and efficiently, and whenever possible without user intervention. A user should always be working with the most up-to-date information (except when a shadow copy is specified) and fast and efficient replication should ensure that where several copies of a file exist on a network they contain the same information and latency is minimized.
You have several tools that you use to configure shares and offline files, configure storage, audit file access, prevent inappropriate access, prevent users from using excessive disk resource, and implement disaster recovery. However, the main tool for provisioning storage and implementing a shared folder structure is DFS Management, specifically DFS Namespaces. The main tool for implementing shared folder replication in a
Windows Server 2008 network is DFS Replication.
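Enabling caching when creating the shared folder can also be done from the command line; a sketch, with the folder and share names purely illustrative:

```powershell
# Create the share with offline-files caching enabled for documents.
# /cache accepts Manual, Documents, Programs, or None.
net share Data=D:\Data /grant:Everyone,CHANGE /cache:Documents
```

Encryption of the cached copies is then enforced on the clients through the "Encrypt the Offline Files cache" Group Policy setting under Computer Configuration\Administrative Templates\Network\Offline Files.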
Question No: 39 HOTSPOT – (Topic 1)
A company runs a third-party DHCP Application on a Windows Server 2008 R2 server. The Application runs as a service that launches a background process upon startup.
The company plans to migrate the DHCP Application to a Windows Server 2008 R2 failover cluster.
You need to provide high availability for the DHCP Application. Which service or Application should you configure?
To answer, select the appropriate service or Application in the answer area.
Windows Server 2008 (and R2) Failover Clustering supports virtually every workload which comes with Windows Server, however there are many custom and 3rd party applications which take advantage of our infrastructure to provide high-availability. Additionally there are some applications which were not originally designed to run in a failover cluster. These can be created, managed by and integrated with Failover Clustering using a generic container, with applications using the Generic Application resource type.
We use the Generic Application resource type to enable such applications to run in a highly-available environment which can benefit from clustering features (i.e. high availability, failover, etc.).
When a generic application resource is online, it means that the application is running. When a generic application is offline, it means that the application is not running. http://blogs.msdn.com/b/clustering/archive/2009/04/10/9542115.aspx
A cluster-unaware application is distinguished by the following features.
The application does not use the Failover Cluster API. Therefore, it cannot discover information about the cluster environment, interact with cluster objects, detect that it is running in a cluster, or change its behavior between clustered and non-clustered systems. If the application is managed as a cluster resource, it is managed as a Generic Application resource type or Generic Service resource type. These resource types provide very basic routines for failure detection and application shutdown. Therefore, a cluster-unaware application might not be able to perform the initialization and cleanup tasks needed for it to be consistently available in the cluster.
Most older applications are cluster-unaware. However, a cluster-unaware application can be made cluster-aware by creating resource types to manage the application. A custom resource type provides the initialization, cleanup, and management routines specific to the needs of the application.
There is nothing inherently wrong with cluster-unaware applications. As long as they function and remain highly available when managed as Generic Applications or Generic Services, there is no need to make them cluster-aware. However, if an application does not start, stop, or fail over consistently when managed by the generic types, it should be made cluster-aware.
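The health model described above is deliberately simple: a Generic Application resource is online as long as the process it launched is still alive. The following Python sketch illustrates that liveness-check idea only; the function names are hypothetical and the cluster service is not actually implemented this way.

```python
import subprocess
import sys

def start_generic_application(command):
    """Launch the application process, as the Generic Application
    resource does when it is brought online (illustrative only)."""
    return subprocess.Popen(command)

def is_online(process):
    """The resource is considered online while the process is alive;
    poll() returns None for a running process, or its exit code."""
    return process.poll() is None

# Launch a short-lived stand-in "application" and watch its state.
proc = start_generic_application(
    [sys.executable, "-c", "import time; time.sleep(60)"])
print(is_online(proc))   # process is running -> resource is online
proc.terminate()         # simulate the application stopping or failing
proc.wait()
print(is_online(proc))   # process has exited -> resource is offline
```

This mirrors the basic failure detection the generic resource types provide: no application-specific initialization or cleanup, just "is the process running?".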
Question No: 40 – (Topic 1)
Your network consists of a single Active Directory forest. The forest contains one Active Directory domain. The domain contains eight domain controllers. The domain controllers run Windows Server 2003 Service Pack 2.
You upgrade one of the domain controllers to Windows Server 2008 R2.
You need to recommend an Active Directory recovery strategy that supports the recovery of deleted objects.
The solution must allow deleted objects to be recovered for up to one year after the date of deletion.
What should you recommend?
Increase the tombstone lifetime for the forest.
Increase the interval of the garbage collection process for the forest.
Configure daily backups of the Windows Server 2008 R2 domain controller.
Enable shadow copies of the drive that contains the Ntds.dit file on the Windows Server 2008 R2 domain controller.
Answer: A Explanation:
The tombstone lifetime must be substantially longer than the expected replication latency between the domain controllers. The interval between cycles of deleting tombstones must be at least as long as the maximum replication propagation delay across the forest.
Because the expiration of a tombstone lifetime is based on the time when an object was deleted logically, rather than on the time when a particular server received that tombstone through replication, an object's tombstone is collected as garbage on all servers at approximately the same time. If the tombstone has not yet replicated to a particular domain controller, that DC never records the deletion. This is the reason why you cannot restore a domain controller from a backup that is older than the tombstone lifetime.
By default, the Active Directory tombstone lifetime is 60 days in forests created before Windows Server 2003 SP1, and 180 days in forests created with Windows Server 2003 SP1 or later. This value can be changed if necessary by modifying the tombstoneLifetime attribute of the CN=Directory Service object in the configuration partition.
This is related to Windows Server 2003 but should still be relevant: http://www.petri.co.il/changing_the_tombstone_lifetime_windows_ad.htm
When a nonauthoritative restore is performed, objects deleted after the backup was taken will again be deleted when the restored DC replicates with other servers in the domain. On every other DC the object is marked as deleted so that when replication occurs the local copy of the object will also be marked as deleted. The authoritative restore process marks the deleted object in such a way that when replication occurs, the object is restored to active status across the domain. It is important to remember that when an object is deleted it is not instantly removed from Active Directory, but gains an attribute that marks it as deleted until the tombstone lifetime is reached and the object is removed. The tombstone lifetime is the amount of time a deleted object remains in Active Directory and has a default value of 180 days.
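The constraint above reduces to simple date arithmetic: a backup is only usable for restoring a DC while it is younger than the tombstone lifetime. A minimal sketch, taking the 180-day default and the question's one-year requirement as assumed inputs (the function name and sample dates are illustrative):

```python
from datetime import date, timedelta

def backup_is_usable(backup_date, today, tombstone_lifetime_days):
    """A DC backup older than the tombstone lifetime must not be restored:
    deletions replicated after it was taken may already have been
    garbage-collected, so restoring it could resurrect lingering objects."""
    return (today - backup_date) < timedelta(days=tombstone_lifetime_days)

today = date(2011, 12, 31)
old_backup = date(2011, 1, 1)          # roughly one year old

# With the 180-day default, a year-old backup is already unusable...
print(backup_is_usable(old_backup, today, 180))   # False

# ...which is why the one-year recovery requirement forces the tombstone
# lifetime to be increased (answer A), e.g. to 400 days.
print(backup_is_usable(old_backup, today, 400))   # True
```

This is why increasing the tombstone lifetime, rather than simply scheduling daily backups, is the answer: the backups themselves expire with the tombstones.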
To ensure that the Active Directory database is not updated before the authoritative restore takes place, you use Directory Services Restore Mode (DSRM) when performing the authoritative restore process. DSRM allows the administrator to perform the necessary restorations and mark the objects as restored before rebooting the DC and allowing those changes to replicate out to other DCs in the domain.