Installing Azure CLI on Windows ARM

This is for PowerShell Core; Windows PowerShell works with the Win32 version through emulation.

Another quick tip for developers out there with a shiny new Surface Laptop 7 (or similar) running Windows on ARM. Tried installing Azure CLI already, everything went fine, but running it didn’t work? You are not alone: the official support at the time of writing (July 2024) is still lacking.

I managed to work around this, and here are the steps so you don’t waste time and pull your hair out in frustration when ChatGPT or Claude won’t tell you how to fix it.

Note that this is a bit of a hacky approach: we create a virtual environment and run the CLI from there, but hey, if it works… right?

The prerequisite is to install Build Tools through Visual Studio 2022 (free, https://visualstudio.microsoft.com/downloads/): select Desktop development with C++ and install all ARM-related components. You also need to install Rust from https://forge.rust-lang.org/infra/other-installation-methods.html and select the aarch64-pc-windows-msvc version. Add the Cargo folder to your PATH, something like “C:\users\USER\.rustup\toolchains\stable-aarch64-pc-windows-msvc\bin\”.

  • Make sure you have the latest PowerShell Core:
  • winget install --id Microsoft.Powershell --source winget

  • Enable running installers by running this command in PowerShell:
  • Set-ExecutionPolicy RemoteSigned

  • Next, download the latest Python (3.12 at the time of writing) from PowerShell, then run the downloaded installer:
  • curl -o python-3.12.0-arm64.exe https://www.python.org/ftp/python/3.12.0/python-3.12.0-arm64.exe

    Now we have prerequisites in place, and it’s time for some Powershell magic:

    # Create a virtual environment
    python -m venv azure-cli-env
    # Update pip
    azure-cli-env/Scripts/python -m pip install --upgrade pip
    # Install azure-cli and setuptools
    azure-cli-env/Scripts/python -m pip install azure-cli
    azure-cli-env/Scripts/python -m pip install setuptools

    and after that, run this:

    notepad $profile

    Add to your profile file the following:

    Set-Alias -Name az -Value "C:\azure-cli-env\Scripts\az.bat"

    Restart your PowerShell, and running az should work now. It goes without saying: make sure the virtual environment path matches the location in the Set-Alias command.
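    If the az.bat shim ever gives you trouble, an alternative profile entry is a function that calls the CLI module through the venv’s Python. This is an untested sketch and the path is an assumption; adjust it to wherever you created the virtual environment:

```powershell
# Alternative to Set-Alias: invoke the azure.cli module directly.
# Adjust C:\azure-cli-env to your own virtual environment location.
function az { & "C:\azure-cli-env\Scripts\python.exe" -m azure.cli @args }
```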

    Posted in WindowsARM | Tagged , | 1 Comment

    Installing the Teams Outlook add-in on Windows ARM

    I got myself a new Surface Laptop 7 with Windows on ARM recently and found out that some things just don’t work out of the box. To save everyone else time, I’m putting up a few nuggets here on how to avoid hours of roaming the depths of the internet and getting frustrated with hallucinating AI helpers.

    First thing you need to do is install the old Teams client, which of course would be nice if it just worked (it doesn’t). So follow these steps to install it.

  • Close the Outlook and Teams clients; log out of Teams first (important)
  • Go to Download old Teams ARM64 and run the ARM64 installer; never mind the instructions there.
  • Open %SystemDrive%\Program Files (x86)\Teams Installer and run the installer
  • Verify that you now have a %LocalAppData%\Microsoft\TeamsMeetingAddin folder; if not, restart and run the installer again.
  • Open a command prompt and go to that folder. There should be a subfolder named something like 1.0.24151.2: go there, and inside it, go to the x64 folder.
  • Run the following command: regsvr32 /n /i:user .\Microsoft.Teams.AddinLoader.dll
  • Start Outlook (classic) and create a new item; Teams Meeting should be at the bottom of the list.

    Hope this helped you a bit on your journey into the wonderful world of Windows ARM.


    Creating Desired State Configuration with Azure Bicep

    Many times you’ll run into a situation where you want your installation package to install the app automatically when you deploy a VM to Azure. DSC is the right tool for that, and there are plenty of samples showing how to do it with ARM templates. When you turn to Bicep, you won’t find any complete samples; you need to collect the necessary pieces from crumbs of information, and it takes some time to figure out how to do it. As I went down that path, I thought it would be nice to put it all in one place for the next person who needs to do the same thing.

    Create the DSC configuration

    The first thing you need is a valid DSC configuration. Let’s create one using the 7-Zip installation package. Create a new file called MyDSC1.ps1 with the following content:

    Configuration InstallApp
    {
        Import-DscResource -ModuleName PsDesiredStateConfiguration
    
        Node 'localhost'
        {
            Package InstallApp
            {
                Ensure = 'Present'
                Name = '7-Zip 19.00 (x64 edition)'
                Path = 'https://www.7-zip.org/a/7z1900-x64.msi'
                ProductId = '23170F69-40C1-2702-1900-000001000000'
            }
        }
    }

    Things worth mentioning here are the name of the configuration, InstallApp, which we need later, and ProductId. ProductId is something you can get with PowerShell from the installed apps on your computer, and the Name parameter also needs to match the name visible in the Apps & Features list of installed apps, so you can’t put anything random there. If you are installing an EXE file instead of an MSI package, you can leave the ProductId field empty.
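    For example, one way to list candidate Name and ProductId values with PowerShell (a sketch; note that enumerating Win32_Product is slow and only covers MSI-installed apps):

```powershell
# List installed MSI products: Name must match DSC's Name parameter,
# IdentifyingNumber (a GUID) is what goes into ProductId
Get-CimInstance -ClassName Win32_Product |
    Select-Object Name, IdentifyingNumber
```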

    Azure doesn’t accept ps1 files; it wants a zip archive instead. The recommended way to create it is the PowerShell command Publish-AzVMDscConfiguration, which is part of the Az.Compute PowerShell module. If you don’t have it installed, run (and select A when asked):

    Install-Module -Name Az.Compute -AllowClobber

    Now that the module is installed (in case it wasn’t already), you can then create the zip archive with:

    Publish-AzVMDscConfiguration .\MyDSC1.ps1 -OutputArchivePath '.\MyDSC1.zip'

    Now that you have a zip archive containing the ps1 file, head to the Azure portal and create a Storage Account. After creating it, open the Storage Account, select ‘Containers’ under ‘Data storage’, create a new container called ‘installers’, and upload MyDSC1.zip to it. Next open the file, create a SAS token for it, and copy the https link to Notepad for later use.

    Create the Bicep for the VM

    Next we dig into Bicep and see how to attach the VM extension in the VM script. Let’s see how the script looks and then dissect it a bit. Here we have a VM, and just after it we define the extension:

    resource vm 'Microsoft.Compute/virtualMachines@2021-03-01' = {
      name: vmName
    ...
    }
    resource extDSC 'Microsoft.Compute/virtualMachines/extensions@2021-03-01' = {
      parent: vm
      name: 'Microsoft.Powershell.DSC'
      location: 'westeurope'
      properties: {
        autoUpgradeMinorVersion: true
        publisher: 'Microsoft.Powershell'
        type: 'DSC'
        typeHandlerVersion: '2.83'
        settings: {
          ModulesUrl: '<YOUR SAS URL HERE, e.g. https://mydsctest2021.blob.${environment().suffixes.storage}/installers/MyDSC1.zip?sp...'
          ConfigurationFunction: 'MyDSC1.ps1\\InstallApp'
          WmfVersion: 'latest'
          Privacy: {
            DataCollection: 'Enable'
          }
        }
        protectedSettings: {}
      }
    }
    

    You’ll notice that we tie the extension to the machine by setting parent to point to the VM resource; this also handles dependsOn behind the scenes. typeHandlerVersion is just the version of the Powershell.DSC extension we are using. ModulesUrl is where we put the SAS URL you created, but do notice the ${environment().suffixes.storage} part, which replaces the blob storage URL suffix. This is one of the Bicep rules: you will get a warning if you don’t replace that part of the SAS URL, saying you are not allowed to hard-code that URL in Bicep. ConfigurationFunction specifies that we have a configuration file called MyDSC1.ps1 which includes a Configuration called InstallApp, and that is what should be executed.
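    As an illustration of that rewrite, here is a small Python sketch (hypothetical helper; in public Azure, environment().suffixes.storage resolves to core.windows.net):

```python
def bicepify_sas_url(sas_url: str) -> str:
    """Swap the hard-coded public-cloud storage suffix for the
    Bicep environment() expression, as the linter asks for."""
    return sas_url.replace("core.windows.net",
                           "${environment().suffixes.storage}")

sas = "https://mydsctest2021.blob.core.windows.net/installers/MyDSC1.zip?sp=r&sig=abc"
print(bicepify_sas_url(sas))
# https://mydsctest2021.blob.${environment().suffixes.storage}/installers/MyDSC1.zip?sp=r&sig=abc
```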

    Now let’s take a look at a complete Bicep file you can use to test-deploy a VM and the other necessary resources, with a DSC extension that installs 7-Zip:

    // Parameters
    param location string = 'westeurope'
    param adminUsername string = 'azureAdmin'
    @secure()
    param adminPassword string
    
    // Variables
    var vnetName = 'myVnet'
    var nsgName = 'myVM-nsg'
    var nsgId = resourceId(resourceGroup().name, 'Microsoft.Network/networkSecurityGroups', nsgName)
    var vnetId = resourceId(resourceGroup().name, 'Microsoft.Network/virtualNetworks', vnetName)
    var subnetName = 'default'
    var subnetRef = '${vnetId}/subnets/${subnetName}'
    
    var subnetPrefix = '10.0.0.0/26'
    var publicIpAddressName  = 'myVM-ip'
    var vmName = 'myVM'
    var addressPrefix  = [
      '10.0.0.0/25'
    ]
    
    // Resources
    resource nic 'Microsoft.Network/networkInterfaces@2018-10-01' = {
      name: 'myNIC'
      location: location
      properties: {
        ipConfigurations: [
          {
            name: 'ipconfig1'
            properties: {
              subnet: {
                id: subnetRef
              }
              privateIPAllocationMethod: 'Dynamic'
              publicIPAddress: {
                id: resourceId(resourceGroup().name, 'Microsoft.Network/publicIpAddresses', publicIpAddressName)
              }
            }
          }
        ]
        enableAcceleratedNetworking: true
        networkSecurityGroup: {
          id: nsgId
        }
      }
      dependsOn: [
        nsg
        vnet
        publicIP
      ]
    }
    
    resource nsg 'Microsoft.Network/networkSecurityGroups@2019-02-01' = {
      name: nsgName
      location: location
      properties: {
        securityRules: [
          {
            name: 'RDP'
            properties: {
              protocol: 'Tcp'
              sourcePortRange: '*'
              destinationPortRange: '3389'
              sourceAddressPrefix: '*'
              destinationAddressPrefix: '*'
              access: 'Allow'
              priority: 300
              direction: 'Inbound'
              sourcePortRanges: []
              destinationPortRanges: []
              sourceAddressPrefixes: []
              destinationAddressPrefixes: []
            }
          }
        ]
      }
    }
    
    resource vnet 'Microsoft.Network/virtualNetworks@2020-11-01' = {
      name: vnetName
      location: location
      properties: {
        addressSpace: {
          addressPrefixes: addressPrefix
        }
        subnets: [
          {
            name: subnetName
            properties: {
              addressPrefix: subnetPrefix
              networkSecurityGroup: {
                id: nsg.id
              }
            }
          }
        ]
      }
    }
    
    resource publicIP 'Microsoft.Network/publicIpAddresses@2019-02-01' = {
      name: publicIpAddressName
      location: location
      properties: {
        publicIPAllocationMethod: 'Dynamic'
      }
      sku: {
        name: 'Basic'
      }
    }
    
    resource vm 'Microsoft.Compute/virtualMachines@2021-03-01' = {
      name: vmName
      location: location
      properties: {
        hardwareProfile: {
          vmSize: 'Standard_DS1_v2'
        }
        storageProfile: {
          osDisk: {
            createOption: 'FromImage'
            managedDisk: {
              storageAccountType: 'StandardSSD_LRS'
            }
          }
          imageReference: {
            publisher: 'MicrosoftWindowsServer'
            offer: 'WindowsServer'
            sku: '2019-Datacenter'
            version: 'latest'
          }
        }
        networkProfile: {
          networkInterfaces: [
            {
              id: nic.id
            }
          ]
        }
        osProfile: {
          computerName: vmName
          adminUsername: adminUsername
          adminPassword: adminPassword
          windowsConfiguration: {
            enableAutomaticUpdates: false
            provisionVMAgent: true
            patchSettings: {
              enableHotpatching: false
              patchMode: 'Manual'
            }
          }
        }
        diagnosticsProfile: {
          bootDiagnostics: {
            enabled: true
          }
        }
      }
    }
    
    resource extDSC 'Microsoft.Compute/virtualMachines/extensions@2021-03-01' = {
      parent: vm
      name: 'Microsoft.Powershell.DSC'
      location: location
      properties: {
        autoUpgradeMinorVersion: true
        publisher: 'Microsoft.Powershell'
        type: 'DSC'
        typeHandlerVersion: '2.83'
        settings: {
          ModulesUrl: '<PUT YOUR SAS TOKEN HERE AND USE ${environment().suffixes.storage}>'
          ConfigurationFunction: 'MyDSC1.ps1\\InstallApp'
          WmfVersion: 'latest'
          Privacy: {
            DataCollection: 'Enable'
          }
        }
        protectedSettings: {}
      }
    }
    
    

    The above script is not production code and leaves doors open for attackers, so do not use it in production; it is here just for testing purposes and to keep things as simple as possible. Save the code above as vm.bicep and run the following commands to deploy it:

    az login
    
    az group create --name myRG --location "West Europe" 
    
    az deployment group create --resource-group myRG --template-file vm.bicep
    

    Now you can go to the Azure portal, connect to the machine with RDP, and see 7-Zip installed in the Start Menu. Not too complicated, once you find all the pieces needed to make it work.

    Scale Set extensions

    If you are wondering how to do the same with a Virtual Machine Scale Set, wonder no more! You can reuse the extension resource definition from the script above; you just need to move it into the extensionProfile under the scale set’s virtualMachineProfile. You define the extension in the extension profile like this:

          extensionProfile: {
            extensions: [
              {
                name: 'Microsoft.Powershell.DSC'
                properties: {
                  autoUpgradeMinorVersion: true
                  publisher: 'Microsoft.Powershell'
                  type: 'DSC'
                  typeHandlerVersion: '2.83'
                  settings: {
                    ModulesUrl: '<PUT YOUR SAS TOKEN HERE AND USE ${environment().suffixes.storage}>' 
                    ConfigurationFunction: 'MyDSC1.ps1\\InstallApp'
                    Properties: ''
                    WmfVersion: 'latest'
                    Privacy: {
                      DataCollection: 'Enable'
                    }
                  }
                }
              }          
            ]
          }
    

    And if you are confused about how it sits in the whole template, fear not: here is a full deployable scale set MVP (ports open to the internet so you can RDP to it, so not production ready!) for you to try (run az group create and az deployment group create using this file as the template):

    param adminUsername string = 'azureAdmin'
    @secure()
    param adminPassword string
    param vmssName string = 'MyScaleSet'
    param location string = resourceGroup().location
    param windowsOSVersion string = '2019-Datacenter'
    
    var osType = {
      publisher: 'MicrosoftWindowsServer'
      offer: 'WindowsServer'
      sku: windowsOSVersion
      version: 'latest'
    }
    var addressPrefix = '10.0.0.0/25'
    var subnetPrefix = '10.0.0.0/26'
    var virtualNetworkName = 'myvnet'
    var publicIPAddressName = 'mypip'
    var subnetName = 'mysubnet'
    var loadBalancerName = 'mylb'
    var natPoolName = 'mynatpool'
    var bePoolName = 'mybepool'
    var natStartPort = 50000
    var natEndPort = 50119
    var natBackendPort = 3389
    var nicname = 'mynic'
    var ipConfigName = 'myipconfig'
    var nsgName='mynsg'
    
    resource vnet 'Microsoft.Network/virtualNetworks@2020-11-01' = {
      name: virtualNetworkName
      location: location
      properties: {
        addressSpace: {
          addressPrefixes: [
            addressPrefix
          ]
        }
        subnets: [
          {
            name: subnetName
            properties: {
              addressPrefix: subnetPrefix
              delegations: []
              privateEndpointNetworkPolicies: 'Enabled'
              privateLinkServiceNetworkPolicies: 'Enabled'
            }
          }
        ]
        virtualNetworkPeerings: []
        enableDdosProtection: false
      }
      dependsOn: [
        loadBalancer
        nsg
        publicIP
      ]
    }
    
    resource nsg 'Microsoft.Network/networkSecurityGroups@2020-11-01' = {
      name: nsgName
      location: location
      properties: {
        securityRules: [
          {
            name: 'RDP'
            properties: {
              protocol: 'Tcp'
              sourcePortRange: '*'
              destinationPortRange: '3389'
              sourceAddressPrefix: '*'
              destinationAddressPrefix: '*'
              access: 'Allow'
              priority: 300
              direction: 'Inbound'
              sourcePortRanges: []
              destinationPortRanges: []
              sourceAddressPrefixes: []
              destinationAddressPrefixes: []
            }
          }
        ]
      }
    }
    
    resource nsgRule 'Microsoft.Network/networkSecurityGroups/securityRules@2020-11-01' = {
      parent: nsg
      name: 'RDP'
      properties: {
        protocol: 'Tcp'
        sourcePortRange: '*'
        destinationPortRange: '3389'
        sourceAddressPrefix: '*'
        destinationAddressPrefix: '*'
        access: 'Allow'
        priority: 300
        direction: 'Inbound'
        sourcePortRanges: []
        destinationPortRanges: []
        sourceAddressPrefixes: []
        destinationAddressPrefixes: []
      }
    }
    
    resource publicIP 'Microsoft.Network/publicIPAddresses@2020-06-01' = {
      name: publicIPAddressName
      location: location
      sku: {
        name: 'Standard'
      }
      properties: {
        publicIPAddressVersion: 'IPv4'
        publicIPAllocationMethod: 'Static'
      }
    }
    
    resource loadBalancer 'Microsoft.Network/loadBalancers@2020-06-01' = {
      name: loadBalancerName
      location: location
      sku: {
        name: 'Standard'
      }
      properties: {
        frontendIPConfigurations: [
          {
            name: 'LoadBalancerFrontEnd'
            properties: {
              privateIPAllocationMethod: 'Dynamic'
              publicIPAddress: {
                id: publicIP.id
              }
            }
          }
        ]
        backendAddressPools: [
          {
            name: bePoolName
          }
        ]
        inboundNatPools: [
          {
            name: natPoolName
            properties: {
              frontendIPConfiguration: {
                id: resourceId('Microsoft.Network/loadBalancers/frontendIPConfigurations', loadBalancerName, 'LoadBalancerFrontEnd')
              }
              protocol: 'Tcp'
              enableFloatingIP: false
              enableTcpReset: false
              frontendPortRangeStart: natStartPort
              frontendPortRangeEnd: natEndPort
              backendPort: natBackendPort
            }
          }
        ]
        probes: [
          {
            name: 'tcpProbe'
            properties: {
              protocol: 'Tcp'
              port: 80
              intervalInSeconds: 5
              numberOfProbes: 2
            }
          }
        ]    
        loadBalancingRules: [
          {
            name: 'LBRule'
            properties: {
              frontendIPConfiguration: {
                id: resourceId('Microsoft.Network/loadBalancers/frontendIPConfigurations', loadBalancerName, 'LoadBalancerFrontEnd')
              }
              backendAddressPool: {
                id: resourceId('Microsoft.Network/loadBalancers/backendAddressPools', loadBalancerName, '${bePoolName}')
              }
              protocol: 'Tcp'
              frontendPort: 80
              backendPort: 80
              enableFloatingIP: false
              idleTimeoutInMinutes: 5
              enableTcpReset: false
              disableOutboundSnat: false
              loadDistribution: 'Default'
              probe: {
                id: resourceId('Microsoft.Network/loadBalancers/probes', loadBalancerName, 'tcpProbe')
              }
            }
          }
        ]
      }
    }
    
    // VM Scale set
    resource vmss 'Microsoft.Compute/virtualMachineScaleSets@2020-06-01' = {
      name: vmssName
      location: location
      sku: {
        name: 'Standard_DS1_v2'
        tier: 'Standard'
        capacity: 2
      }
      properties: {
        overprovision: true
        upgradePolicy: {
          mode: 'Manual'
          automaticOSUpgradePolicy: {
            enableAutomaticOSUpgrade: false
          }
        }
        virtualMachineProfile: {
          storageProfile: {
            osDisk: {
              createOption: 'FromImage'
              caching: 'ReadWrite'
            }
            imageReference: osType
          }
          osProfile: {
            computerNamePrefix: 'my'
            adminUsername: adminUsername
            adminPassword: adminPassword
          }
          networkProfile: {
            networkInterfaceConfigurations: [
              {
                name: nicname
                properties: {
                  primary: true
                  enableAcceleratedNetworking: true
                  networkSecurityGroup: {
                    id: nsg.id
                  }
                  dnsSettings: {
                    dnsServers: []
                  }
                  enableIPForwarding: false              
                  ipConfigurations: [
                    {
                      name: ipConfigName
                      properties: {
                        publicIPAddressConfiguration: {
                          name: 'publicIp-my-vnet-nic01'
                          properties: {
                            idleTimeoutInMinutes: 15
                            ipTags: []
                            publicIPAddressVersion: 'IPv4'
                          }
                        }
                        primary: true
                        subnet: {
                          id: '${vnet.id}/subnets/${subnetName}'
                        }
                        loadBalancerBackendAddressPools: [ 
                          {
                              id: '/subscriptions/${subscription().subscriptionId}/resourceGroups/${resourceGroup().name}/providers/Microsoft.Network/loadBalancers/${loadBalancerName}/backendAddressPools/${bePoolName}'
                          }
                        ]
                        loadBalancerInboundNatPools: [ 
                          {
                            id: '/subscriptions/${subscription().subscriptionId}/resourceGroups/${resourceGroup().name}/providers/Microsoft.Network/loadBalancers/${loadBalancerName}/inboundNatPools/${natPoolName}'
                          }
                        ]
                      }
                    }
                  ]
                }
              }
            ]
          }
          extensionProfile: {
            extensions: [
              {
                name: 'Microsoft.Powershell.DSC'
                properties: {
                  autoUpgradeMinorVersion: true
                  publisher: 'Microsoft.Powershell'
                  type: 'DSC'
                  typeHandlerVersion: '2.83'
                  settings: {
                    ModulesUrl: '<YOUR SAS TOKEN HERE>' 
                    ConfigurationFunction: 'MyDSC1.ps1\\InstallApp'
                    Properties: ''
                    WmfVersion: 'latest'
                    Privacy: {
                      DataCollection: 'Enable'
                    }
                  }
                }
              }          
            ]
          }
        }
      }
    }
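    The long load balancer IDs interpolated in the scale set’s ipConfigurations follow the standard ARM resource ID pattern. A quick Python sketch (a hypothetical helper, just to show the shape of the string):

```python
def lb_backend_pool_id(subscription_id: str, resource_group: str,
                       lb_name: str, pool_name: str) -> str:
    """Build the ARM resource ID of a load balancer backend address pool."""
    return (f"/subscriptions/{subscription_id}"
            f"/resourceGroups/{resource_group}"
            f"/providers/Microsoft.Network/loadBalancers/{lb_name}"
            f"/backendAddressPools/{pool_name}")

print(lb_backend_pool_id("0000-0000", "myRG", "mylb", "mybepool"))
```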

    Connect Arduino to Raspberry via USB to read serial data

    Okay, first of all, this has been made way too complicated in the official samples, which require you to get USB-to-TTL cables or similar, when all you really need is the product id and vendor id, which you can easily check and hard-code in your project.

    So, if you are doing some cool IoT project and have an Arduino reading sensors and writing the results to the serial port, this is what you need on the Raspberry Pi side to make the magic work:

    You need to include two namespaces:

    using Windows.Devices.SerialCommunication;
    using Windows.Devices.Enumeration;

    What I did next was based on a hint from the article by Bereket Godebo, INTERFACING ARDUINO IN THE UNIVERSAL WINDOWS PLATFORM, which is in C++/CX. Connect the Arduino to a normal PC first, go to Device Manager, and under the Ports section open the Arduino. Go to the Details tab and select Hardware Ids from the Property combo box. There you have a string that looks something like USB\\VID_2341&PID_0043…, from which you take the VID and PID and input them as hex values into the following code:

    var aqs = SerialDevice.GetDeviceSelectorFromUsbVidPid(0x2341, 0x0043);
    var devices = await DeviceInformation.FindAllAsync(aqs);

    The first parameter is the vendor id, the second is the product id. The ids used above are for an Arduino Uno, so if you have one, that code should work as it is.
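    The VID/PID extraction from the Device Manager string can be sketched like this in Python (a hypothetical helper, just to show where the hex values come from):

```python
import re

def parse_hardware_id(hwid: str):
    """Extract (vendor_id, product_id) from a Windows hardware ID string."""
    m = re.search(r"VID_([0-9A-Fa-f]{4})&PID_([0-9A-Fa-f]{4})", hwid)
    if m is None:
        raise ValueError(f"no VID/PID found in {hwid!r}")
    return int(m.group(1), 16), int(m.group(2), 16)

vid, pid = parse_hardware_id(r"USB\VID_2341&PID_0043")
print(hex(vid), hex(pid))  # 0x2341 0x43
```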

    Now the rest of the code needed to read the serial is only as follows:

    // member variables
    private SerialDevice serialPort;
    private DataReader reader;
    private CancellationTokenSource ReadCancellationTokenSource = new CancellationTokenSource(); // must be initialized before the read loop uses its Token
    

    Put this in the method where you want to do the reading; it will find the Arduino, create a serial connection to it, and loop calling the read of the input:

    var aqs = SerialDevice.GetDeviceSelectorFromUsbVidPid(0x2341, 0x0043);
    var devices = await DeviceInformation.FindAllAsync(aqs);
                
    if (devices.Count > 0)
    {
        Debug.WriteLine("Serial devices found...");
        DeviceInformation deviceInfo = devices[0];
        serialPort = await SerialDevice.FromIdAsync(deviceInfo.Id);
        serialPort.ReadTimeout = TimeSpan.FromMilliseconds(1000);
        serialPort.BaudRate = 9600;
        serialPort.Parity = SerialParity.None;
        serialPort.DataBits = 8;
        serialPort.StopBits = SerialStopBitCount.One;
    
        try
        {
            if (serialPort != null)
            {
                reader = new DataReader(serialPort.InputStream);
                Debug.WriteLine("Reading serial port COM");
    
                // Read the serial input in loop
                while (true)
                {
                    await ReadAsync(ReadCancellationTokenSource.Token);
                }
            }
        }
        catch (TaskCanceledException tex)
        {
            Debug.WriteLine("Reading was cancelled");
    
            serialPort.Dispose();
        }
        catch (Exception ex)
        {
            Debug.WriteLine(ex.Message);
        }
        finally
        {
            if (reader != null)
            {
                reader.DetachStream();
                reader = null;
            }
        }
    }

    The actual reading of the serial port happens in this method:

    private async Task ReadAsync(CancellationToken cancellationToken)
    {
        Task<UInt32> loadAsyncTask;
        uint ReadBufferLength = 1024;
    
        // Cancel in case it was requested 
        cancellationToken.ThrowIfCancellationRequested();
        reader.InputStreamOptions = InputStreamOptions.Partial;
        loadAsyncTask = reader.LoadAsync(ReadBufferLength).AsTask(cancellationToken);
    
        // Launch the task and wait 
        UInt32 bytesRead = await loadAsyncTask;
        if (bytesRead > 0)
        {
            string txt = reader.ReadString(bytesRead);
            Debug.WriteLine(txt);
        }
    } 
    

    That’s all that’s needed for the Raspberry Pi to read from the Arduino through a normal USB cable plugged into the Arduino and connected to the Raspberry Pi.
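    One caveat about the reading loop: with InputStreamOptions.Partial a single read can return a fragment of a line, so if your Arduino writes newline-terminated readings you may want to buffer until a full line arrives. The idea, sketched in Python (names are my own):

```python
def split_lines(buffer: str, chunk: str):
    """Append a partial read to the buffer; return complete lines and the leftover."""
    buffer += chunk
    *lines, buffer = buffer.split("\n")
    return lines, buffer

buf = ""
lines, buf = split_lines(buf, "temp:2")          # no newline yet, no complete lines
more, buf = split_lines(buf, "1.5\ntemp:22.0\nhum")
print(lines, more, repr(buf))  # [] ['temp:21.5', 'temp:22.0'] 'hum'
```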


    Using Cognitive Services with Xamarin Forms

    I did a talk at the DotNext conference about this and thought I’d share the demo here as well. In this demo we’ll create a simple Xamarin Forms app for Android and UWP, add camera support, and use the Cognitive Services Emotion and Computer Vision APIs to analyze the photo taken by the camera.

    Setup

    It all starts with File, New Project… and selecting, under Cross Platform (in Visual Studio 2015), the project type Blank App (Xamarin.Forms Portable). Once you have created the project, you can remove the other projects, but leave yourapp, yourapp.Droid and yourapp.UWP for this demo.

    Next you’ll need to add three Nuget -packages by selecting the Solution from the Solution Explorer and right clicking it and Manage Nuget packages for solution…. Go to Browse tab, and type Microsoft.ProjectOxford and hit enter.

    Now you are presented with a list of all Cognitive Services Nuget packages, but you need to select only two from the list. First select Microsoft.ProjectOxford.Emotion and from the right side panel, click only the main project, not the yourapp.Droid or yourapp.UWP projects. Now select from the list Microsoft.ProjectOxford.Vision and repeat the previous selection.

    We need camera access, so we’ll add one more Nuget package, Xam.Plugin.Media (type it into the Browse search box as before), but this time install it to all projects: yourapp, yourapp.Droid and yourapp.UWP.

    We need to do one more step inside the Nuget package manager. Go to the Updates tab, clear the search field where you typed the package names, find Xamarin.Forms in the list and update it to the latest version (if you don’t clear the search box, you will not see anything in the updates list).

    Now we need to ensure the security requirements are in place: open Package.appxmanifest in the yourapp.UWP project, select Pictures Library and Webcam on the Capabilities tab, and save. There’s one last configuration step before we’re ready to start coding, and that is to include the UWP project in the build (it isn’t by default, which is weird!). On the toolbar, select the Debug dropdown and choose Configuration Manager from the list. Make sure the Build and Deploy options are selected for the yourapp.UWP project.

    The next step is to get the keys used to authenticate your app with Cognitive Services. Open the Cognitive services subscription link and log in with your Microsoft account to request keys for Computer Vision and Emotion.

    Code

    We’ll implement a simple UI with a button to snap a photo and send it to Cognitive Services, and an image control to show the photo from the camera. We’ll implement all this code within app.cs so it runs on all platforms without any platform-specific code. As the first step we’ll add the required using statements at the beginning of the file:

    using Microsoft.ProjectOxford.Emotion;
    using Microsoft.ProjectOxford.Vision;
    using Plugin.Media;
    using System.Diagnostics;
    

    Next we’ll add the following member variables inside your app -class:

    string key = "YOUR_EMOTION_API_KEY", key2 = "YOUR_COMPUTER_VISION_KEY";

    Now copy your new key values from the webpage into those variables and you’re all set to start calling Cognitive Services.

    Then we need to add the following member variables under the key -variable that you just added in previous step:

    Image image = new Image();
    StackLayout layout;

    The image is for showing the camera photo and layout is the forms base layout control, which will hold all the controls for the app.

    Inside app.cs, go to the public App() -constructor and delete everything except the following line:

    MainPage = new NavigationPage(content);

    Now we add the UI for the app just above that MainPage -line like this:

    // The root page of your application
    layout = new StackLayout
    {
        BackgroundColor = Xamarin.Forms.Color.Gray
    };
    var button = new Button
    {
        HorizontalOptions = LayoutOptions.Center,
        Text = "photo"
    };
    button.Clicked += Button_Clicked;
    layout.Children.Add(button);
    layout.Children.Add(image);
    var content = new ContentPage
    {
        Content = layout
    };
    

    We set the background color ourselves because Android uses a dark theme by default and UWP a light one, so the text stays visible on both. Next we create a button in the middle of the screen and add an event handler for it, and lastly we add the image and button controls to the screen layout.

    Now we are ready to take the photo, and call the cognitive services with our photo. We’ll add the button click handler where we do all this by adding this method:

    private async void Button_Clicked(object sender, EventArgs e)
    {
        // Camera magic
        await CrossMedia.Current.Initialize();

        if (!CrossMedia.Current.IsCameraAvailable || !CrossMedia.Current.IsTakePhotoSupported)
        {
            return;
        }

        var mediaOptions = new Plugin.Media.Abstractions.StoreCameraMediaOptions
        {
            CompressionQuality = 100,
            PhotoSize = Plugin.Media.Abstractions.PhotoSize.Full,
            Directory = "Faces",
            // Use an explicit format; the default DateTime string contains ':'
            // characters, which are not valid in file names
            Name = $"{DateTime.UtcNow:yyyyMMdd_HHmmss}.jpg"
        };
        var file = await CrossMedia.Current.TakePhotoAsync(mediaOptions);
    }

    The first line initializes the camera, then we check that the device has a camera which can take a photo before continuing. The camera options tell the plugin where to save the images and what filename to use (we use the current date and time). The last line opens the platform-specific camera UI and returns a file containing the photo the user took.
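    A note on the Name option: interpolating a raw DateTime produces ':' characters, which are not valid in Windows file names, so it’s worth formatting the timestamp explicitly. A minimal standalone sketch (the exact format string is just an example):

```csharp
using System;

class FileNameDemo
{
    static void Main()
    {
        // The default DateTime string (e.g. "5/1/2016 1:30:00 PM") contains ':'
        // characters, which are invalid in Windows file names.
        var taken = new DateTime(2016, 5, 1, 13, 30, 0);
        string name = $"{taken:yyyyMMdd_HHmmss}.jpg";
        Console.WriteLine(name); // 20160501_133000.jpg
    }
}
```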

    Now we’re ready to send this image for analysis on the cloud. Add the following code in the end of Button_Clicked -method:

    if (file != null)
    {
        image.Source = ImageSource.FromFile(file.Path);
        var emotionClient = new EmotionServiceClient(key);
        var imageStream = file.GetStream();
    
        // Emotion API
        var emotion = await emotionClient.RecognizeAsync(imageStream);
        Debug.WriteLine($"Anger: {emotion[0].Scores.Anger:#0.##%}");
        Debug.WriteLine($"Contempt: {emotion[0].Scores.Contempt:#0.##%}");
        Debug.WriteLine($"Disgust: {emotion[0].Scores.Disgust:#0.##%}");
        Debug.WriteLine($"Fear: {emotion[0].Scores.Fear:#0.##%}");
        Debug.WriteLine($"Happiness: {emotion[0].Scores.Happiness:#0.##%}");
        Debug.WriteLine($"Neutral: {emotion[0].Scores.Neutral:#0.##%}");
        Debug.WriteLine($"Sadness: {emotion[0].Scores.Sadness:#0.##%}");
        Debug.WriteLine($"Surprise: {emotion[0].Scores.Surprise:#0.##%}");
    
        // Computer Vision
        var imageStream2 = file.GetStream();
        var visionClient = new VisionServiceClient(key2);
        VisualFeature[] visualFeats = new VisualFeature[]
        {
            VisualFeature.Description,
            VisualFeature.Faces
        };
    
        var analysis = await visionClient.AnalyzeImageAsync(imageStream2, visualFeats);
        Debug.WriteLine($"{analysis.Faces[0].Gender}, age {analysis.Faces[0].Age}");
        Debug.WriteLine(analysis.Description.Captions[0].Text);
        foreach (var tag in analysis.Description.Tags)
            Debug.WriteLine(tag);
    }
    

    In the code above we first set the image control to show the photo we captured. Then we use the key we requested earlier to open a connection to the Emotion API in the cloud, get a stream of the image and send it to the API for analysis. In real code you should check that you actually got results, but here we assume it worked and just dump the output to the console, formatting the results as percentages.

    In the Computer Vision section of the code we need to get another stream, as the previous call closed the earlier one, and then set the visual features we want analyzed from the picture. There are more options available, but the more you ask for, the slower it is, so this time we only request the Description of the image and the faces in it. We call AnalyzeImageAsync with the file stream and the visual feature request, again assume we get results, and dump them to the output console inside Visual Studio.
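    The #0.##% custom format used in the Debug.WriteLine calls multiplies the value by 100, keeps up to two decimals and appends a percent sign; a minimal standalone check (using the invariant culture so the decimal separator is deterministic):

```csharp
using System;
using System.Globalization;

class PercentFormatDemo
{
    static void Main()
    {
        double score = 0.8765;
        // "#0.##%" multiplies by 100, keeps up to two decimals and appends '%'
        string text = score.ToString("#0.##%", CultureInfo.InvariantCulture);
        Console.WriteLine(text); // 87.65%
    }
}
```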

    That was pretty easy to implement, but a bit tricky for a beginner, so I thought this would be a useful guide for someone. Hope you enjoyed it, and let me know if you did or didn’t!

    Posted in Uncategorized | Leave a comment

    Connect SensorTag CC2650 to Azure IOT Hub

    I needed to create a demo showing the simplest way to get some sensor data into IoT Hub, and I thought sharing the code might be helpful to someone else as well. The app is divided into two parts: the first handles the sensor and the second connects to IoT Hub.

    First we start by creating a new empty UWP app; you can give it whatever name you want. Because the Texas Instruments SensorTag is a Bluetooth device, we’ll first initialize the Bluetooth connection bits (please note that you need to manually pair the SensorTag in the Windows 10 settings; here we’re only dealing with the code that handles paired devices).

    First open Package.appxmanifest, and on the Capabilities tab tick Bluetooth. Open your MainPage.xaml.cs file and add the following namespaces:

    using Windows.Devices.Bluetooth;
    using Windows.Devices.Enumeration;
    using Windows.Devices.Bluetooth.GenericAttributeProfile;
    using Windows.Storage.Streams;
    using Windows.ApplicationModel.Core;

    We will be needing a couple of class members, so let’s add those now:

    private BluetoothLEDevice tagDevice;
    private GattDeviceService gattDevice;

    Next we add the code to find the paired SensorTag device. I placed this code in the MainPage_Loaded event handler, but you can find a more suitable place for it in your code:

    foreach (var info in await DeviceInformation.FindAllAsync(BluetoothLEDevice.GetDeviceSelector()))
    {
        BluetoothLEDevice bleDevice = await BluetoothLEDevice.FromIdAsync(info.Id);
        if (bleDevice.Name.ToLower().Contains("sensortag"))
        {
            tagDevice = bleDevice;
            foreach (var serv in tagDevice.GattServices)
            {
                if (serv.Uuid.ToString() == "f000aa00-0451-4000-b000-000000000000")
                {
                    gattDevice = serv;
                    break;
                }
            }
            break;
        }
    }
    

    The code above first goes through all the connected BLE devices and checks for each of them whether they have “sensortag” somewhere in their name. This probably also finds first-generation SensorTags, but you can always make it more specific if needed. After that we look for a certain service by its GATT service UUID, which in this case is the id of the temperature sensor. The SensorTag supports other ids/sensors as well, so you can always add more services and make gattDevice a list instead.
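    If you do add more services, a small lookup table keeps the GUIDs in one place. This sketch only lists the temperature GUIDs used in this post (the table name and keys are just illustrative; the SensorTag’s other sensors follow the same f000aaXX pattern):

```csharp
using System;
using System.Collections.Generic;

class UuidTable
{
    // Only the temperature GUIDs from this post are listed here.
    public static readonly Dictionary<string, Guid> Temperature = new Dictionary<string, Guid>
    {
        ["service"] = Guid.Parse("f000aa00-0451-4000-b000-000000000000"),
        ["data"]    = Guid.Parse("f000aa01-0451-4000-b000-000000000000"),
        ["config"]  = Guid.Parse("f000aa02-0451-4000-b000-000000000000"),
    };

    static void Main()
    {
        Console.WriteLine(Temperature["service"]);
    }
}
```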

    Next step is to start reading the Sensortag itself, and it can be done with the following code:

    if (tagDevice != null && gattDevice != null)
    {
        // Test connection
        Guid configuration = Guid.Parse("f000aa02-0451-4000-b000-000000000000"); // temperature profile
        var configurationCharacteristics = gattDevice.GetCharacteristics(configuration).First();
    
        GattCommunicationStatus status = await configurationCharacteristics.WriteValueAsync((new byte[] { 1 }).AsBuffer());
    
        if (status != GattCommunicationStatus.Success)
        {
            Debug.WriteLine("Connection failed, ensure the SensorTag is powered on");
        }
        else
        {
            // Setup the data read
            Guid data = Guid.Parse("f000aa01-0451-4000-b000-000000000000"); // temperature data
            GattCharacteristic dataCharacteristic = gattDevice.GetCharacteristics(data).First();
            if (dataCharacteristic.CharacteristicProperties.HasFlag(
                GattCharacteristicProperties.Read |
                GattCharacteristicProperties.Notify))
            {
                dataCharacteristic.ValueChanged += this.dataValueChanged;
                status = await dataCharacteristic.WriteClientCharacteristicConfigurationDescriptorAsync(
                    GattClientCharacteristicConfigurationDescriptorValue.Notify);
    
                if (status != GattCommunicationStatus.Success)
                {
                    Debug.WriteLine("Sensor access denied");
                }
            }
        }
    }

    The code block above first looks up the temperature configuration characteristic from the GATT service and tests whether the connection can be established. If everything is OK and we can read and get notified of data changes, we set up a callback for the temperature data. Yet again, you can set up multiple callbacks for different sensors by calling GetCharacteristics with the different sensor data GUIDs. All that is left is the callback method itself, which looks like this:

    private async void dataValueChanged(GattCharacteristic sender, GattValueChangedEventArgs args)
    {
        GattReadResult readResult = await sender.ReadValueAsync(BluetoothCacheMode.Uncached);

        if (readResult.Status == GattCommunicationStatus.Unreachable)
        {
            Debug.WriteLine("Connection lost");
            return;
        }
    
        // Read the sensor value
        var result = new byte[readResult.Value.Length];
        DataReader.FromBuffer(readResult.Value).ReadBytes(result);
    
        // Extract ambient temperature
        double ambTemp = BitConverter.ToUInt16(result, 2) / 128.0;
    
        string data = $"{ambTemp}";
    
        // Code to send data to IOT Hub
        //TagSensor sensor = new TagSensor() {Temperature = ambTemp};
        //await SendEvent(sensor);
    
        await CoreApplication.MainView.CoreWindow.Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal,
                () =>
                {
                    try
                    {
                        // This is needed in case you want to update UI with sensor data, temperatureText is a TextBlock
                        //temperatureText.Text = data;
                    }
                    catch (Exception ex) { }
                });
                
        Debug.WriteLine(data);
    
    }
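    The byte-to-temperature step in dataValueChanged can be checked in isolation; the ambient temperature is a little-endian UInt16 at byte offset 2, in units of 1/128 °C (AmbientFromRaw is a hypothetical helper mirroring that line):

```csharp
using System;

class TemperatureDemo
{
    // Mirrors the conversion in dataValueChanged: the ambient temperature is a
    // little-endian UInt16 at byte offset 2, in units of 1/128 °C.
    public static double AmbientFromRaw(byte[] raw) =>
        BitConverter.ToUInt16(raw, 2) / 128.0;

    static void Main()
    {
        // 0x0C80 = 3200, and 3200 / 128 = 25.0 °C
        byte[] sample = { 0x00, 0x00, 0x80, 0x0C };
        Console.WriteLine(AmbientFromRaw(sample)); // 25
    }
}
```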

    At this point we are able to read the SensorTag temperature values and get callbacks for the changes. Next we’ll add the code required to send this to IoT Hub.

    First you should create a new IoT Hub in the Azure portal and take note of the Connection String – Primary key (found under Shared Access Policies, iothubowner). Copy that value to the connectionString member variable in MainPage.xaml.cs and the host name to connectionString2:

    // IOT Hub
    static RegistryManager registryManager;
    static string connectionString = "HostName=XXXXX.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=XXXXXXXXXXXXXXXXXXXXXXXXX";
    static string connectionString2 = "XXXXX.azure-devices.net";
    static string deviceKey;
    private DeviceClient deviceClient;
    

    Add the following NuGet packages to your solution:
    – Microsoft.Azure.Devices
    – Microsoft.Azure.Devices.Client
    and, in case it wasn’t added automatically, Newtonsoft.Json as well.

    Next you should add the namespaces as well:

    using Microsoft.Azure.Devices.Client;
    using Microsoft.Azure.Devices;
    using Microsoft.Azure.Devices.Common.Exceptions;
    using System.Runtime.Serialization.Json;
    using System.IO;
    using System.Text;
    using System.Threading.Tasks;

    Use the latest stable versions of the packages.

    For the device to be able to connect to the IOT Hub you first need to setup the device in the device registry. That can be done with the following code:

    private async Task AddDeviceAsync()
    {
        registryManager = RegistryManager.CreateFromConnectionString(connectionString);
    
        string deviceId = "TiTag1";
        Device device = null;
        try
        {
            device = await registryManager.AddDeviceAsync(new Device(deviceId));
        }
        catch (DeviceAlreadyExistsException)
        {
            device = await registryManager.GetDeviceAsync(deviceId);
        }
        catch (Exception ex)
        {
            Debug.WriteLine(ex.Message);
        }
        if (device != null)
            deviceKey = device.Authentication.SymmetricKey.PrimaryKey;
    
    }
    

    The device id can be anything you want to give it, in this demo I just simply call it TiTag1.

    Next add a call to the method at the beginning of MainPage_Loaded (remember to change its signature to private async void MainPage_Loaded so you can await inside it):

    // Create the IoT Hub Device Client instance
    await AddDeviceAsync();
    deviceClient = DeviceClient.Create(connectionString2, new DeviceAuthenticationWithRegistrySymmetricKey("TiTag1", deviceKey));

    Now is a good time to add a class to represent the sensor, where you can add the other sensors later if you like. Add this before the MainPage class:

    public class TagSensor
    {
        public TagSensor()
        {
            Time = DateTime.Now;
        }
    
        public DateTime Time;
        public double Temperature;
    }
    

    Next we’ll need to serialize the data to JSON and send it to IoT Hub, which can be done like this:

    private async Task SendEvent(TagSensor sensor)
    {
        var serializer = new DataContractJsonSerializer(typeof(TagSensor));
        var stream = new MemoryStream();
        serializer.WriteObject(stream, sensor);
        string json = Encoding.UTF8.GetString(stream.ToArray(), 0, (int)stream.Length);
    
        var eventMessage = new Microsoft.Azure.Devices.Client.Message(Encoding.UTF8.GetBytes(json));
        await deviceClient.SendEventAsync(eventMessage);
    }
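    For reference, this standalone console sketch shows roughly what the serialized payload sent to IoT Hub looks like (the temperature value is made up; DataContractJsonSerializer writes the DateTime in its \/Date(ms)\/ form):

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Json;
using System.Text;

// Same shape as the TagSensor class in the post
public class TagSensor
{
    public TagSensor() { Time = DateTime.Now; }

    public DateTime Time;
    public double Temperature;
}

class SerializeDemo
{
    static void Main()
    {
        // Hypothetical reading; a real value comes from dataValueChanged
        var sensor = new TagSensor { Temperature = 25.5 };

        var serializer = new DataContractJsonSerializer(typeof(TagSensor));
        using (var stream = new MemoryStream())
        {
            serializer.WriteObject(stream, sensor);
            string json = Encoding.UTF8.GetString(stream.ToArray());
            // Members come out alphabetically, e.g. {"Temperature":25.5,"Time":"\/Date(...)\/"}
            Console.WriteLine(json);
        }
    }
}
```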
    

    The last piece is to uncomment the TagSensor and SendEvent lines in the dataValueChanged -method, and you should be good to go. Please let me know in the comments if you are having issues, always happy to help!


    Add Windows Hello -sign in to your app

    With Windows 10 you have the option to use biometric methods to log in, but it looks like there is quite little documentation on how to implement it. Another thing is that people confuse Windows Hello with an authentication framework, which it is not, even though you can use it together with one for authentication. So this article shows you how to implement login functionality in your UWP app using Microsoft Passport and Windows Hello.

    It’s important to understand that this will use whatever is available: it could be iris recognition on, for example, a Lumia 950, a fingerprint reader on your ThinkPad, or just a PIN code if the machine doesn’t have any biometric sensors. The code itself is actually very simple, and this is all you would need.

    Add the required references to app.xaml.cs:

    using Windows.Security.Credentials;
    using Windows.Security.Cryptography;

    After that we can implement the login in App.xaml.cs and you can do this in many different ways, but I have a static member variable here:

    private static bool authorized = false;

    Now all that is left is the actual login, which you can copy/paste into your code (at the beginning of the OnLaunched -method):

    // Do we have capability to provide credentials from the device
    if (await KeyCredentialManager.IsSupportedAsync())
    {
        // Get credentials for current user and app
        KeyCredentialRetrievalResult result = await KeyCredentialManager.OpenAsync("MyAppCredentials");
        if (result.Credential != null)
        {
            KeyCredentialOperationResult signResult =
                await
                    result.Credential.RequestSignAsync(CryptographicBuffer.ConvertStringToBinary("LoginAuth",
                        BinaryStringEncoding.Utf8));
            if (signResult.Status == KeyCredentialStatus.Success)
            {
                authorized = true;
            }
        }
        // No previous saved credentials found
        else
        {
            KeyCredentialRetrievalResult creationResult =
                await
                    KeyCredentialManager.RequestCreateAsync("MyAppCredentials",
                        KeyCredentialCreationOption.ReplaceExisting);
            if (creationResult.Status == KeyCredentialStatus.Success)
            {
                authorized = true;
            }
        }
    }

    When you check IsSupportedAsync you need to handle the situation where the device is not capable of providing this service and fall back to something else, such as Facebook or Twitter authentication. OpenAsync checks if there are saved credentials for this app and user, and if found, uses them with RequestSignAsync. If there were no previous credentials for the app for the current user, we create them. That’s all there is to it: a very confusing topic, but actually surprisingly easy to use. Hope this helps you!


    Figuring out if your app is run on Phone, Tablet or Desktop

    One would think that it’s easy to answer the question in the header above. If you’ve ever tried, you have most likely become extremely frustrated when you realized that there are just too many combinations, and it becomes nearly impossible to tell the devices apart with the provided APIs. For example, the Surface is a tablet running the Desktop SKU, while tablets with screens under 8″ run the Mobile SKU, which is the same SKU that runs on phones. For IoT there’s IoT Core and IoT Enterprise, the latter actually being Windows 10 Enterprise with an additional IoT license.

    In general you should always design for Universal, which adapts to the available screen space and hardware capabilities, but sometimes you just need to know (for example for analytics), and that’s why I created this little snippet. It should cover most cases, but do let me know if the logic fails to identify something and I’ll try to include that as well.

    I threw in my best guess for Continuum and IoT as well, after some Twitter discussions with @dotMorten, @ScottIsAFool, @gcaughey and @mahoekst.

    public enum Device
    {
        Phone,
        Tablet,
        Desktop,
        Xbox,
        Iot,
        Continuum
    };
    ...
    if (Windows.System.Profile.AnalyticsInfo.VersionInfo.DeviceFamily == "Windows.Mobile")
    {
        if (ApiInformation.IsApiContractPresent("Windows.Phone.PhoneContract", 1))
        {
            Windows.Devices.Input.KeyboardCapabilities keyboard = new Windows.Devices.Input.KeyboardCapabilities();
            if (keyboard.KeyboardPresent > 0)
            {
                runningDevice = Device.Continuum;
            }
            else
            {
                runningDevice = Device.Phone;
            }
        }
        else
        {
            runningDevice = Device.Tablet;
        }
    }
    else if (Windows.System.Profile.AnalyticsInfo.VersionInfo.DeviceFamily == "Windows.Desktop")
    {
        Windows.Devices.Input.KeyboardCapabilities keyboard = new Windows.Devices.Input.KeyboardCapabilities();
        if (keyboard.KeyboardPresent > 0)
        {
            runningDevice = Device.Desktop;
        }
        else
        {
            runningDevice = Device.Tablet;
        }
    }
    else if (Windows.System.Profile.AnalyticsInfo.VersionInfo.DeviceFamily == "Windows.Xbox")
    {
        runningDevice = Device.Xbox;
    }
    else if (Windows.System.Profile.AnalyticsInfo.VersionInfo.DeviceFamily == "Windows.IoT")
    {
        runningDevice = Device.Iot;
    }
    else
    {
        // Couldn't figure it out, let's assume it's desktop (safest bet)
        runningDevice = Device.Desktop;
    }
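    As a sanity check, the branching above can be lifted into a plain function (Detect is a hypothetical helper so the logic can be exercised without the Windows.System.Profile APIs; the final desktop assignment acts as the fallback):

```csharp
using System;

enum Device { Phone, Tablet, Desktop, Xbox, Iot, Continuum }

static class DeviceDetector
{
    // Hypothetical pure-function version of the detection branching,
    // with the desktop assignment as the default fallback.
    public static Device Detect(string family, bool hasPhoneContract, bool hasKeyboard)
    {
        switch (family)
        {
            case "Windows.Mobile":
                if (!hasPhoneContract) return Device.Tablet;
                return hasKeyboard ? Device.Continuum : Device.Phone;
            case "Windows.Desktop":
                return hasKeyboard ? Device.Desktop : Device.Tablet;
            case "Windows.Xbox":
                return Device.Xbox;
            case "Windows.IoT":
                return Device.Iot;
            default:
                return Device.Desktop; // couldn't figure it out; safest bet
        }
    }
}

class DetectDemo
{
    static void Main()
    {
        Console.WriteLine(DeviceDetector.Detect("Windows.Mobile", true, true));    // Continuum
        Console.WriteLine(DeviceDetector.Detect("Windows.Desktop", false, false)); // Tablet
    }
}
```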

    How I do Hamburger menu on SplitView, quick and dirty

    I wanted to do a nice hamburger menu in my app, but it seems that all the samples out there do it differently compared to how the default apps on Windows 10 seem to implement it. I don’t want a CompactInline menu always visible on the side of the screen, just the hamburger button, which opens the menu when clicked. This is how I implemented mine; hope you can get something out of it:

    First I created BoolToVisibilityConverter class, so I can later on bind the visibility of the menu to the hamburger button (right click on project, add new item, class):

    public class BoolToVisibilityConverter : IValueConverter
    {
        public object Convert(object value, Type targetType, object parameter, string language)
        {
            return (bool)value ? Visibility.Visible : Visibility.Collapsed;
        }
    
        public object ConvertBack(object value, Type targetType, object parameter, string language)
        {
            throw new NotImplementedException();
        }
    }

    Then I open the MainPage.xaml and add the reference to the namespace in the page element after xmlns:local -line:

    xmlns:utils="using:MyNameSpace.Utils"

    where the MyNameSpace.Utils is the namespace where you created the BoolToVisibilityConverter -class in previous step.

    After that, you would need to add in the resources the following:

    <Page.Resources>
        <utils:BoolToVisibilityConverter x:Key="BoolToVisibilityConverter" />
    </Page.Resources>

    Now that we have all the prerequisites in place, we can build the page itself. This is the bare-bones structure for the main page with a hamburger menu and SplitView:

    <Grid Background="#FFFF8000">
        <Grid.RowDefinitions>
            <RowDefinition Height="50"/>
            <RowDefinition/>
        </Grid.RowDefinitions>
        <!--Menu bar on top-->
        <RelativePanel Grid.Row="0" >
        <ToggleButton x:Name="HamburgerButton" FontFamily="Segoe MDL2 Assets" Content="&#xE700;" Width="50" Height="50" Background="Transparent" IsChecked="False"/>
            <TextBlock Text="My title" FontSize="32" Width="300" Height="40" Foreground="Black" RelativePanel.RightOf="HamburgerButton"/>
        </RelativePanel>
        <!--Hamburger menu-->
        <SplitView Grid.Row="1" x:Name="MySplitView" PanePlacement="Left" CompactPaneLength="50" OpenPaneLength="300" 
                IsPaneOpen="{Binding IsChecked, ElementName=HamburgerButton}" DisplayMode="Overlay" >
            <SplitView.Pane>
                <StackPanel Orientation="Vertical">
                    <StackPanel x:Name="SettingsPanel" Orientation="Horizontal">
                    <Button x:Name="SettingsButton" FontFamily="Segoe MDL2 Assets" Content="&#xE713;" Width="50" Height="50"/>
                        <TextBlock Text="Settings" FontSize="18" VerticalAlignment="Center" />
                    </StackPanel>
                    <StackPanel x:Name="AboutPanel" Orientation="Horizontal">
                    <Button x:Name="AboutButton" FontFamily="Segoe MDL2 Assets" Content="&#xE77B;" Width="50" Height="50"/>
                        <TextBlock Text="About" FontSize="18" VerticalAlignment="Center" />
                    </StackPanel>
                </StackPanel>
            </SplitView.Pane>
            <!--Main content-->
            <Grid Grid.Row="1">
                
            </Grid>
        </SplitView>
    </Grid>
    
    

    In the code above, the RelativePanel section is the menu bar, holding the hamburger button and the title. In the SplitView section there is SplitView.Pane, where the actual menu is built: each item is a horizontal StackPanel containing a Button and a TextBlock. Inside the SplitView there is also the Grid where the actual page content goes. That’s all that’s required for a simple hamburger menu; hope it’s helpful!


    How to make a Windows Store game with C# and XAML, part 5

    In this mini post we’ll finally be adding some sounds to our game. You will need some sound effects in your Assets -folder: one for the laser shot and one for the enemy hit. One place you can download some of these for free is Free game content resources. Add one with the name pew.mp3 under Assets (right click, add existing..) and one with the name Pop.mp3 (note the capital P, matching the code below).

    Windows 10 finally brings an easy way to add low-latency sound to your universal app with managed classes, without needing to interop with DirectX. The main class is AudioGraph, and it is a good replacement for XAudio2.

    We need to edit GamePage.xaml.cs and first add the using statements below; System.Diagnostics is not needed for the sound itself, just for some debug output in case something goes wrong with the audio initialization.

    using Windows.Media.Audio;
    using System.Diagnostics;
    using Windows.Storage;
    using Windows.Media.Render;

    Now we need to add the member variables used to play the audio and keep the audio files in memory. The AudioGraph class does the audio playback, AudioDeviceOutputNode is the output device where the sound is played, and AudioFileInputNode holds a sound file. These go as member variables in the GamePage class; add them under the bool fireSuppressor:

    private AudioGraph graph;
    private AudioDeviceOutputNode deviceOutput;
    private Dictionary<string, AudioFileInputNode> fileInputs = new Dictionary<string, AudioFileInputNode>();

    In the Loaded event handler (declare it as private async void so you can await) we need to add the following:

    await InitSound();

    Next we need to add the method itself, which will create the audio pipeline, load the sounds and enable us to play them later on:

    private async System.Threading.Tasks.Task InitSound()
    {
        AudioGraphSettings settings = new AudioGraphSettings(AudioRenderCategory.Media);
        CreateAudioGraphResult result = await AudioGraph.CreateAsync(settings);
    
        if (result.Status == AudioGraphCreationStatus.Success)
        {
            graph = result.Graph;
    
            // Create the output device for audio playing
            CreateAudioDeviceOutputNodeResult deviceOutputNodeResult = await graph.CreateDeviceOutputNodeAsync();
    
            // Ensure there's audio output ready and available
            if (deviceOutputNodeResult.Status == AudioDeviceNodeCreationStatus.Success)
            {
                deviceOutput = deviceOutputNodeResult.DeviceOutputNode;
                graph.ResetAllNodes();
    
                await AddFileToSounds("ms-appx:///Assets/pew.mp3");
                await AddFileToSounds("ms-appx:///Assets/Pop.mp3");
                // here you could list all your sounds like row above
    
                graph.Start();
            }
        }
    }
    

    Add this method, which adds a given sample to memory for playing later on:

    /// <summary>
    /// Load and add resource sound file to memory dictionary for playing
    /// </summary>
    /// <returns></returns>
    private async System.Threading.Tasks.Task AddFileToSounds(string uri)
    {
        var soundFile = await StorageFile.GetFileFromApplicationUriAsync(new Uri(uri));
        CreateAudioFileInputNodeResult fileInputResult = await graph.CreateFileInputNodeAsync(soundFile);
    
        if (AudioFileNodeCreationStatus.Success == fileInputResult.Status)
        {
            fileInputs.Add(soundFile.Name, fileInputResult.FileInputNode);
            fileInputResult.FileInputNode.Stop();
            fileInputResult.FileInputNode.AddOutgoingConnection(deviceOutput);
        }
    }
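    One thing to watch: fileInputs is keyed by StorageFile.Name, and Dictionary lookups are case-sensitive by default, so the key must match the asset file name exactly (“Pop.mp3”, not “pop.mp3”). A tiny sketch:

```csharp
using System;
using System.Collections.Generic;

class KeyCaseDemo
{
    static void Main()
    {
        // Dictionary<string, T> uses an ordinal, case-sensitive comparer by default
        var sounds = new Dictionary<string, string> { ["Pop.mp3"] = "explosion" };

        Console.WriteLine(sounds.ContainsKey("Pop.mp3")); // True
        Console.WriteLine(sounds.ContainsKey("pop.mp3")); // False
    }
}
```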

    That is all that’s needed to set up sound playback. Our next job is to hook the sound playing into the correct places in the code. Go to the OnFire method and add the following under bullets.Add(bullet); :

    // Play the sound
    var pew = fileInputs["pew.mp3"];
    pew.Reset();
    pew.Start();
    

    The code above plays the shooting sound; now we add the explosion sound for hitting an enemy. Go to HitTest and add the following code under enemies[i].Dead = true; :

    // Play the sound
    var pop = fileInputs["Pop.mp3"];
    pop.Reset();
    pop.Start();

    That’s all! Now just compile and run the game and enjoy your sound effects. In the next blog post we’ll talk about scaling the app properly on different screens, doing the Windows 10 adaptive screen magic.
