Install NVIDIA GPU drivers on N-series VMs running Windows

To take advantage of the GPU capabilities of Azure N-series VMs backed by NVIDIA GPUs, you must install NVIDIA GPU drivers. The NVIDIA GPU Driver Extension installs appropriate NVIDIA CUDA or GRID drivers on an N-series VM. Install or manage the extension using the Azure portal or tools such as Azure PowerShell or Azure Resource Manager templates. See the NVIDIA GPU Driver Extension documentation for supported operating systems and deployment steps.
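
As an illustration of the extension route, the following Azure PowerShell call is a minimal sketch that deploys the NVIDIA GPU Driver Extension to an existing VM. The resource group, VM name, location, and handler version are placeholder values; confirm the current values in the NVIDIA GPU Driver Extension documentation.

    # Sketch: deploy the NVIDIA GPU Driver Extension to an existing N-series VM.
    # "myResourceGroup", "myVM", and "eastus" are placeholders; the handler
    # version shown is illustrative (check the extension docs for the current one).
    Set-AzVMExtension -ResourceGroupName "myResourceGroup" `
        -VMName "myVM" `
        -Location "eastus" `
        -Publisher "Microsoft.HpcCompute" `
        -ExtensionType "NvidiaGpuDriverWindows" `
        -Name "NvidiaGpuDriverWindows" `
        -TypeHandlerVersion "1.3"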

If you choose to install NVIDIA GPU drivers manually, this article provides supported operating systems, drivers, and installation and verification steps. Manual driver setup information is also available for Linux VMs.

For basic specs, storage capacities, and disk details, see GPU Windows VM sizes.

Supported operating systems and drivers

NVIDIA Tesla (CUDA) drivers

NVIDIA Tesla (CUDA) drivers for NC, NCv2, NCv3, NCasT4_v3, ND, and NDv2-series VMs (optional for NV-series) are supported only on the operating systems listed in the following table. Driver download links are current at time of publication. For the latest drivers, visit the NVIDIA website.

Tip

As an alternative to manual CUDA driver installation on a Windows Server VM, you can deploy an Azure Data Science Virtual Machine image. The DSVM editions for Windows Server 2016 pre-install NVIDIA CUDA drivers, the CUDA Deep Neural Network Library, and other tools.

OS                     Driver
Windows Server 2019    451.82 (.exe)
Windows Server 2016    451.82 (.exe)

NVIDIA GRID drivers

Microsoft redistributes NVIDIA GRID driver installers for NV and NVv3-series VMs used as virtual workstations or for virtual applications. Install these GRID drivers only on Azure NV-series VMs, and only on the operating systems listed in the following table. These drivers include licensing for GRID Virtual GPU Software in Azure; you do not need to set up an NVIDIA vGPU software license server.

The GRID drivers redistributed by Azure do not work on non-NV-series VMs such as the NCv2, NCv3, ND, and NDv2-series. The one exception is the NCasT4_v3 series, where the GRID drivers enable graphics functionality similar to that of the NV-series.

NC-series VMs with NVIDIA K80 GPUs do not support GRID/graphics applications.

Note that the NVIDIA extension always installs the latest driver. Links to previous versions are provided here for customers who depend on an older version.

For Windows Server 2019, Windows Server 2016 1607 and 1709, and Windows 10 (up to build 20H2):

  • GRID 12.0 (461.09) (.exe)
  • GRID 11.3 (452.77) (.exe)

For Windows Server 2012 R2:

  • GRID 12.0 (461.09) (.exe)
  • GRID 11.3 (452.77) (.exe)

For the complete list of all previous NVIDIA GRID driver links, visit GitHub.

Driver installation

  1. Connect by Remote Desktop to each N-series VM.

  2. Download, extract, and install the supported driver for your Windows operating system.

After GRID driver installation on a VM, a restart is required. After CUDA driver installation, a restart is not required.

Verify driver installation

Note that the NVIDIA Control Panel is only available with a GRID driver installation; if you installed CUDA drivers, the NVIDIA Control Panel will not be visible.

You can verify driver installation in Device Manager: a successfully installed GPU, such as the Tesla K80 on an Azure NC VM, appears under Display adapters.

To query the GPU device state, run the nvidia-smi command-line utility installed with the driver.

  1. Open a command prompt and change to the C:\Program Files\NVIDIA Corporation\NVSMI directory.

  2. Run nvidia-smi. If the driver is installed, you will see output similar to the example below. GPU-Util shows 0% unless you are currently running a GPU workload on the VM. Your driver version and GPU details may differ from those shown.
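
For illustration only, a session on an NC VM with the 451.82 driver might look like the following; the exact layout, versions, and GPU details vary by driver release and VM size.

    PS C:\Program Files\NVIDIA Corporation\NVSMI> .\nvidia-smi
    +-----------------------------------------------------------------------------+
    | NVIDIA-SMI 451.82       Driver Version: 451.82       CUDA Version: 11.0     |
    |-------------------------------+----------------------+----------------------+
    | GPU  Name            TCC/WDDM | Bus-Id        Disp.A | Volatile Uncorr. ECC |
    | Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
    |===============================+======================+======================|
    |   0  Tesla K80           TCC  | 00000001:00:00.0 Off |                    0 |
    | N/A   42C    P8    33W / 149W |      0MiB / 11441MiB |      0%      Default |
    +-------------------------------+----------------------+----------------------+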

RDMA network connectivity

RDMA network connectivity can be enabled on RDMA-capable N-series VMs such as NC24r deployed in the same availability set or in a single placement group in a virtual machine scale set. The HpcVmDrivers extension must be added to install Windows network device drivers that enable RDMA connectivity. To add the VM extension to an RDMA-enabled N-series VM, use Azure PowerShell cmdlets for Azure Resource Manager.

To install the latest version (1.1) of the HpcVmDrivers extension on an existing RDMA-capable VM named myVM in the West US region:
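
A minimal Azure PowerShell sketch follows; "myResourceGroup" is a placeholder, and the publisher and extension type are the values documented for HpcVmDrivers.

    # Install the HpcVmDrivers extension (version 1.1) on myVM in West US.
    # "myResourceGroup" is a placeholder resource group name.
    Set-AzVMExtension -ResourceGroupName "myResourceGroup" `
        -VMName "myVM" `
        -Location "westus" `
        -Publisher "Microsoft.HpcCompute" `
        -ExtensionType "HpcVmDrivers" `
        -Name "HpcVmDrivers" `
        -TypeHandlerVersion "1.1"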

For more information, see Virtual machine extensions and features for Windows.

The RDMA network supports Message Passing Interface (MPI) traffic for applications running with Microsoft MPI or Intel MPI 5.x.

Next steps

  • Developers building GPU-accelerated applications for the NVIDIA Tesla GPUs can also download and install the latest CUDA Toolkit. For more information, see the CUDA Installation Guide.
RemoteFX vGPU setup and configuration

Applies to: Windows Server 2016, Microsoft Hyper-V Server 2016

Note

Because of security concerns, RemoteFX vGPU is disabled by default on all versions of Windows starting with the July 14, 2020 Security Update. To learn more, see KB 4570006.

The vGPU feature for RemoteFX makes it possible for multiple virtual machines to share a physical GPU. Rendering and compute resources are shared dynamically among virtual machines, making RemoteFX vGPU appropriate for high-burst workloads where dedicated GPU resources are not required. For example, in a VDI service, RemoteFX vGPU can be used to offload app rendering costs to the GPU, with the effect of decreasing CPU load and improving service scalability.

RemoteFX vGPU requirements

Host system requirements:

  • Windows Server 2016
  • A DirectX 11.0-compatible GPU with a WDDM 1.2-compatible driver
  • A CPU with Second Level Address Translation (SLAT) support

Guest VM requirements:

  • Supported guest OS. For more information, see RemoteFX 3D Video Adapter (vGPU) support.

Additional considerations for guest VMs:

  • OpenGL and OpenCL functionality is only available in guests running Windows 10 or Windows Server 2016.
  • DirectX 11.0 is only available for guests running Windows 8 or later.

Enable RemoteFX vGPU

To configure RemoteFX vGPU on your Windows Server 2016 host:

  1. Install the graphics drivers recommended by your GPU vendor for Windows Server 2016.
  2. Create a VM running a guest OS supported by RemoteFX vGPU. To learn more, see RemoteFX 3D Video Adapter (vGPU) support.
  3. Add the RemoteFX 3D graphics adapter to the VM. To learn more, see Configure the RemoteFX vGPU 3D adapter.

By default, RemoteFX vGPU uses all available and supported GPUs. To limit which GPUs RemoteFX vGPU uses, follow these steps:

  1. Navigate to the Hyper-V settings in Hyper-V Manager.
  2. Select Physical GPUs in Hyper-V Settings.
  3. Select the GPU that you don't want to use, and then clear Use this GPU with RemoteFX.
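
If you'd rather script this than use the Hyper-V Manager UI, the Hyper-V PowerShell module exposes equivalent cmdlets. This is a sketch; the "*K620*" name pattern is a placeholder, so list the adapters first and substitute the name your host reports.

    # List the physical GPUs available to RemoteFX on this host.
    Get-VMRemoteFXPhysicalVideoAdapter

    # Exclude a specific GPU from RemoteFX use ("*K620*" is a placeholder pattern).
    Get-VMRemoteFXPhysicalVideoAdapter -Name "*K620*" | Disable-VMRemoteFXPhysicalVideoAdapter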

Configure the RemoteFX vGPU 3D adapter

You can use either the Hyper-V Manager UI or PowerShell cmdlets to configure the RemoteFX vGPU 3D graphics adapter.

Configure RemoteFX vGPU with Hyper-V Manager

  1. Stop the VM if it's currently running.

  2. Open Hyper-V Manager, navigate to VM Settings, then select Add Hardware.

  3. Select RemoteFX 3D Graphics Adapter, then select Add.

  4. Set the maximum number of monitors, maximum monitor resolution, and dedicated video memory, or leave the default values.

    Note

    • Setting higher values for any of these options will impact your service scale, so you should only set what is necessary.
    • When you need to use 1 GB of dedicated VRAM, use a 64-bit guest VM instead of 32-bit (x86) for best results.
  5. Select OK to finish the configuration.

Configure RemoteFX vGPU with PowerShell cmdlets

Use the following PowerShell cmdlets to add, review, and configure the adapter:
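
The following is a sketch using the Hyper-V module's RemoteFX cmdlets; the VM name "myVM" and the monitor, resolution, and VRAM values are illustrative.

    # Add a RemoteFX 3D graphics adapter to the VM (the VM must be stopped).
    Add-VMRemoteFx3dVideoAdapter -VMName "myVM"

    # Review the adapter's current settings.
    Get-VMRemoteFx3dVideoAdapter -VMName "myVM"

    # Configure monitor count, maximum resolution, and dedicated video memory.
    Set-VMRemoteFx3dVideoAdapter -VMName "myVM" -MonitorCount 2 -MaximumResolution "1920x1200" -VRAMSizeBytes 1GB

    # Remove the adapter if it's no longer needed.
    Remove-VMRemoteFx3dVideoAdapter -VMName "myVM"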

Monitor performance

The performance and scale of a RemoteFX vGPU-enabled service are determined by a variety of factors such as number of GPUs on your system, total GPU memory, amount of system memory and memory speed, number of CPU cores and CPU clock frequency, storage speed, and NUMA implementation.

Host system memory

For every VM enabled with a vGPU, RemoteFX uses system memory both in the guest operating system and in the host server. The hypervisor guarantees the availability of system memory for a guest operating system. On the host, each vGPU-enabled virtual desktop needs to advertise its system memory requirement to the hypervisor. When the vGPU-enabled virtual desktop starts, the hypervisor reserves additional system memory in the host.

The memory requirement for the RemoteFX-enabled server is dynamic because the amount of memory consumed on the RemoteFX-enabled server is dependent on the number of monitors that are associated with the vGPU-enabled virtual desktops and the maximum resolution for those monitors.

Host GPU video memory

Every vGPU-enabled virtual desktop uses the GPU hardware video memory on the host server to render the desktop. In addition, a codec uses the video memory to compress the rendered screen. The amount of memory needed for rendering and compression is directly based on the number of monitors provisioned to the virtual machine. The amount of reserved video memory varies based on the system screen resolution and how many monitors there are. Some users require a higher screen resolution for specific tasks, but there's greater scalability with lower resolution settings if all other settings remain constant.

Host CPU

The hypervisor schedules the host and VMs on the CPU. The overhead is increased on a RemoteFX-enabled host because the system runs an additional process (rdvgm.exe) per vGPU-enabled virtual desktop. This process uses the graphics device driver to run commands on the GPU. The codec also uses the CPU to compress screen data that needs to be sent back to the client.

More virtual processors mean a better user experience. We recommend allocating at least two virtual CPUs per vGPU-enabled virtual desktop. We also recommend using the x64 architecture for vGPU-enabled virtual desktops because the performance on x64 virtual machines is better compared to x86 virtual machines.
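
As a small illustration, on a Hyper-V host the allocation can be made with the standard processor cmdlet; the VM name "myVM" is a placeholder, and the VM must be off.

    # Allocate two virtual processors to a vGPU-enabled virtual desktop.
    Set-VMProcessor -VMName "myVM" -Count 2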

GPU processing power

Every vGPU-enabled virtual desktop has a corresponding DirectX process that runs on the host server. This process replays all graphics commands it receives from the RemoteFX virtual desktop onto the physical GPU. This is like running multiple DirectX applications at the same time on the same physical GPU.

Usually, graphics devices and drivers are tuned to run only a few applications on the desktop at a time, but RemoteFX stretches the GPUs to go even further. vGPUs come with performance counters that measure the GPU response to RemoteFX requests and help you make sure the GPUs aren't stretched too far.

When a GPU is low on resources, read and write operations take a long time to complete. Administrators can use performance counters to know when to adjust resources and prevent downtime for users.
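
As a hedged starting point, the wildcard query below discovers whichever RemoteFX counter sets are registered on the host instead of assuming specific counter names; the "RemoteFX Graphics" set sampled in the second command is assumed to exist on RemoteFX-enabled hosts.

    # Discover RemoteFX-related performance counter sets registered on this host.
    Get-Counter -ListSet "*RemoteFX*" | Select-Object CounterSetName, Description

    # Sample every counter in one discovered set (set name assumed; adjust to
    # whatever the previous command reports).
    Get-Counter -Counter (Get-Counter -ListSet "RemoteFX Graphics").Paths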

Learn more about performance counters for monitoring RemoteFX vGPU behavior at Diagnose graphics performance issues in Remote Desktop.