In my previous post I showed you how to set up an application in Azure AD and allow Azure AD users to access it. In this post I will show how you can give access to these applications to users outside of your organisation using B2B (Business to Business) as guest users.
B2B is a feature of Azure AD that allows you to easily add two types of user to your applications.
- Users who are part of another Azure AD tenant
- Users who are not.
If your new user is part of another Azure AD tenant, then when you add them as a guest user to your application they will use the credentials provided by their own organisation. This means they do not have to remember a new username and password when they want to access your application. It is also useful because they are managed by their own organisation, so you will not be responsible for resetting their passwords, for example. Another advantage of using their own Azure AD credentials is that they will lose the ability to sign in to your application when their accounts are disabled or removed from your customer’s tenant. They will however still exist as a guest user in your application, but they will no longer be able to sign in.
If your new user is not part of another Azure AD tenant, then they will automatically have a Microsoft account created for them. They will also be prompted to enter a new password. Again this is not managed by you but by Microsoft this time, so password resets are handled by a link provided by them.
To assign a guest user to your application you will need to invite them to use your application. They will then receive an invitation via email that they will need to redeem in order to access your application.
So, go back to the Azure AD blade of the Azure portal and click on Users:
Then click on “New guest user”
Fill in the form and enter your own personal message and click “Invite”. You need to enter a valid email address otherwise the user will not be able to receive the invite, as seen below:
The text highlighted inside the red box was the custom message I entered in the invitation process. It is possible to change the branding of this email but it is an Azure AD premium feature.
The invite process proves that the user has access to the mailbox linked to the email address used. Also, if they are using their organisation’s Azure AD email address then they must also sign in with their own username and password, so you can be confident that the user is who they say they are. This example shows the flow when a user is part of another Azure AD tenant. If the user is not part of another tenant then there will be additional screens for setting up their new Microsoft account and password.
When the user clicks the Accept invitation link they will be redirected to a consent page which is asking for permissions to read their user profile from their Azure AD tenant.
Accepting the permissions will then redirect the user to the applications portal, where they can access the applications they have been assigned. As we have not allocated any applications to this user yet, they will not see anything.
To assign applications to the users, go back to the Azure AD blade in the Azure portal and click on Users, then click on the one you have just added to view their profile:
You can see, in this example, in the red box that this is a Guest user who has accepted the invitation.
Click on Applications in the left hand menu bar and you will see that there are none assigned. To assign this user to an application, navigate back to the Azure AD main blade and click Enterprise applications, then select the application you wish to assign this user to.
Click “Assign users and groups”, then Add User
Click “None Selected” then search for your new user, select them and click Select.
Now click Assign
The new user is now assigned. Go back to the Application screen the user viewed after they signed in and refresh the page.
The assigned application should now be visible, and clicking the application will redirect the user to that application’s web site.
Using Azure AD it is easy to invite users to use your applications, and when they are part of another Azure AD tenant, Azure AD takes all the pain out of federating with these new users’ tenants. Hopefully you have found that this is straightforward and that it opens up access to your applications in a controlled way. My next post will look at how we can automate this using the Graph API.
Introduction to Azure Role Based Access Control (RBAC)
Up until fairly recently I have been managing access to a number of Azure subscriptions, but as I was working for smaller organisations the number of people who needed access was fairly small and easy to manage. It also meant that I generally gave users Owner or Contributor access to the subscriptions, as we were all managing everything and needed access at that level. Now that I work for a large organisation there is a greater need to limit access to certain areas of Azure, and subscription-wide access is limited to a few key administrators. This means that I need to look at the minimum access required for each of the users who need access to the resources. First I’d like to talk about the scope within which permissions can be set in Azure. For most of the scenarios I’ve worked in I have visibility of a single subscription. For organisations with a large number of subscriptions there is a further level of scope, the Management group, which I won’t be discussing.
Permissions can be set at the Subscription, Resource group or individual resource scope. Depending upon the level of access your user requires, there are three basic roles which you can use:
- Owner
- Contributor
- Reader
Owner gives the user full access to everything within the scope and can also assign roles to other users.
Contributor gives the user full access to everything within the scope, except they are not able to assign roles to other users.
Reader gives the user access to view the resources within the scope, but they are not able to change anything or assign roles.
If you assign the user the Owner role at the Subscription scope, they can manage all resources within the subscription and assign roles to other users. A user can be assigned multiple roles, and Azure RBAC is additive, so if a user was assigned Contributor at the subscription scope but only Reader on one of the resource groups, the Contributor role would override the Reader role. It is also possible to have Deny assignments, where a user is explicitly denied specific permissions. Deny assignments take precedence over role assignments.
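These two rules can be illustrated with a tiny model (plain Python, purely illustrative — the role names are real, but effective_actions and its action names are made up for the sketch; this is not an Azure API):

```python
# Toy model of Azure RBAC evaluation: role assignments are additive,
# and deny assignments take precedence over role assignments.
ROLE_ACTIONS = {
    "Owner": {"read", "write", "assign_roles"},
    "Contributor": {"read", "write"},
    "Reader": {"read"},
}

def effective_actions(assigned_roles, denied_actions=()):
    """Union of all assigned roles' actions, minus any denied actions."""
    allowed = set()
    for role in assigned_roles:
        allowed |= ROLE_ACTIONS[role]
    return allowed - set(denied_actions)

# Contributor at subscription scope plus Reader on a resource group:
# the broader Contributor permissions still apply (additive).
print(sorted(effective_actions(["Contributor", "Reader"])))  # ['read', 'write']

# A deny assignment wins over any role assignment.
print(sorted(effective_actions(["Owner"], denied_actions=["write"])))  # ['assign_roles', 'read']
```

The real evaluation is of course far richer (scopes, inheritance, data actions), but the additive-then-deny ordering is the key behaviour to remember.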
These roles, plus the variety of scopes, give some flexibility, but they still grant a large surface area of access. Azure therefore offers a large number of finer-grained roles that allow users to be given specific permissions to specific services. The full list of built-in roles can be seen here: https://docs.microsoft.com/en-us/azure/role-based-access-control/built-in-roles
These finer-grained roles allow you to set specific permissions for a specific user within a specific scope. For example, if you wanted to give a user access to a blob store to upload files via the Azure portal, there are two roles that can be assigned: Reader and Data Access, and Storage Blob Data Contributor. If you assign these two roles to a user on the storage account, then the user is able to log in to the Azure portal, navigate to the storage account and access the blob store.
To do this, navigate to the storage account within which you want to assign a role and click the access control item
Then click “Add role assignment”
In the role drop down pick “Storage Blob Data Contributor”, select the user you want to assign the role to and click Save. Repeat this for the Reader and Data Access role. Your user now has access only to blob storage and no access elsewhere in the resource group or subscription. I could have done the same thing by selecting the resource group and Access control and adding these roles there. This would have given the user access to all blob stores within the resource group.
Another example is that you may want to give someone access to your app service so that they can configure and deploy. Navigate to your App Service and click “Access control”, then select the role “Website Contributor”. See https://docs.microsoft.com/en-us/azure/role-based-access-control/built-in-roles#website-contributor for more details. This lets them manage the selected website but not App Service plans or any other web sites. If you want them to manage other app services, you could add the same role at the resource group level.
Managing Application Access with Azure AD – Part 1
In my next series of blog posts I want to talk about how to manage access to applications using Azure AD.
I’ve been looking at how I can set up access to my web based applications and I want to be able to:
- Have a single sign on with multiple applications
- Allow some users access to only some of the applications
- Be able to give access to users outside of my organisation
- Be able to control access via code
Part 1 will cover setting my applications up and then restricting access to the applications via Azure AD.
In order to test this I needed to have a number of applications that I could use. I used this example:
https://github.com/AzureADQuickStarts/AppModelv2-WebApp-OpenIDConnect-DotNet
It allows me to log in and see my claims. I deployed it into two different app services so I could navigate to them separately. I’m not going to talk about the code on the web side, apart from the bits you need to configure the sample. This series of blogs is more about how to set up Azure AD, and the path I went through to my end goal of configuring users programmatically.
In order to integrate with Azure AD we need to register each of the applications. This will provide us with IDs we can use to configure each of the applications.
In Azure Portal navigate to Azure Active Directory, or search for it in the search bar
In the menu bar on the left select App Registrations –> New registration and complete the form:
I've picked single tenant as I want to invite users using B2B. Now click Register
You need to copy the IDs needed for your web app:
Copy the Client ID and Tenant ID. Repeat this process for the next app. I've created two apps as I wanted to test limiting access to a single app and denying access to the second if the user has not been invited to it or added manually.
Now add these to the web.config in the sample app. There will be two settings for ClientId and Tenant. Make sure that the redirect url matches the url of the application you registered and redeploy. Repeat this for the second application.
If you navigate to the web apps and try to log in, you may get an error as we haven't set up any users, although any users currently in your Azure AD should be able to log in.
To give users access to your app, go back to Azure Active Directory, this time select Enterprise Applications and click on the app you just created.
Click Users and groups
Click Add user
Click None Selected, pick users from the list and click Select. These users have now been given access to your application. However, as I mentioned earlier, all users in your Azure AD are currently able to log in to your web app, so we now need to configure the app so that only assigned users can access it.
Click Properties in your enterprise application and set User Assignment required to yes and click Save. (repeat this for your other application)
Now only users who are assigned to your application can log in. You can test this now. Go to the first application URL and log in with one of the users you assigned. Then go to the second app (you shouldn't have assigned any users to it just yet) and log in. This time you will get an error.
You can now assign users to the second application and the error should go away when you attempt to login.
We’ve now set up our applications in Azure AD and limited access to each application. In my next post I’ll show you how you can then add users from outside of your organisation to these applications.
Exporting Logs from Application Insights using Continuous Export
This is the fifth post of my series of posts about Log Analytics and Application Insights. The previous post talked about adding custom logging to your code using Application Insights. Now you’ve got your logging into Application Insights, you can run Log Analytics queries and build dashboards, alerts etc. Sometimes, though, you want to use this data in other systems, and it would be useful if you could export the data and use it elsewhere. This post will show you how you can regularly export the data from Application Insights into Azure Storage. Once it is in Storage it can easily be moved into other systems or used elsewhere, such as in Power BI. This can be achieved using the Continuous Export feature of Application Insights.
To enable Continuous Export, log in to the Azure management portal and navigate to Application Insights. Click on the instance you want Continuous Export enabled for, then scroll down the options on the left until you find the Configure section and click Continuous Export.
To use Continuous Export you will need to configure a storage account. Click Add:
Click Data types to export:
I was only interested in the logs I emitted from my custom logging, so selected Custom Event, Exception and Trace then clicked OK
Next pick a storage location. Make sure you use the region where your Application Insights instance is located, otherwise you will be charged egress fees for moving the data to another datacentre.
You can now pick an existing storage account or create a new one. Upon selecting a storage account you can pick an existing container or create a new one. Once a blob container is selected, click OK.
Continuous Export is now configured. You will not see anything in the storage container until the next set of logs are sent to Application Insights.
My log analytics query shows the following logs have been generated:
If you look at the continuous Export configuration page you will see that the last updated date has changed.
Now look in blob storage. You should see a folder named <ApplicationInsights_ServiceName>_<ApplicationInsights_InstrumentationKey>.
Click through and you will see a number of folders, one for each of the data types enabled when setting up Continuous Export. Click through one of them and you will see a folder for the date, then a folder for the hour, and then a file containing the logs for that hour.
You will get a row of JSON data for each log entry, and the logs emitted will appear under each of the enabled data-type folders. For example:
{
  "event": [
    {
      "name": "Some Important Work Completed",
      "count": 1
    }
  ],
  "internal": {
    "data": {
      "id": "a guid is here",
      "documentVersion": "1.61"
    }
  },
  "context": {
    "data": {
      "eventTime": "2020-03-22T18:12:30.2553417Z",
      "isSynthetic": false,
      "samplingRate": 100.0
    },
    "cloud": {},
    "device": {
      "type": "PC",
      "roleInstance": "yourcomputer",
      "screenResolution": {}
    },
    "session": {
      "isFirst": false
    },
    "operation": {},
    "location": {
      "clientip": "0.0.0.0",
      "continent": "Europe",
      "country": "United Kingdom",
      "province": "Nottinghamshire",
      "city": "Nottingham"
    },
    "custom": {
      "dimensions": [
        {
          "CustomerID": "4df16004-2f1b-48c0-87d3-c1251a5db3f6"
        },
        {
          "OrderID": "5440d1cf-5d06-4b0e-bffb-fad522af4ad1"
        },
        {
          "InvoiceID": "a7d5a8fb-2a2e-4697-8ab4-f7bf8b8dbe18"
        }
      ]
    }
  }
}
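Since each exported blob is newline-delimited JSON (one document per line), pulling fields back out downstream is straightforward. A minimal Python sketch, using a trimmed-down line that matches the field paths in the sample document above:

```python
import json

# One line from a Continuous Export blob, trimmed to the fields we use here.
line = '''{"event":[{"name":"Some Important Work Completed","count":1}],
"context":{"custom":{"dimensions":[{"CustomerID":"4df16004-2f1b-48c0-87d3-c1251a5db3f6"}]}}}'''

doc = json.loads(line)
event_name = doc["event"][0]["name"]

# Custom dimensions arrive as a list of single-key objects; flatten them to one dict.
dimensions = {}
for d in doc["context"]["custom"]["dimensions"]:
    dimensions.update(d)

print(event_name)                  # Some Important Work Completed
print(dimensions["CustomerID"])    # 4df16004-2f1b-48c0-87d3-c1251a5db3f6
```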
As the data is now out of Application Insights you can move it wherever you need it. You will also need to manage the blob storage data too, otherwise you will end up with the logs stored in two places and the storage costs will be doubled.
One example of subsequent usage is exporting the data to Event Hub. As the data is in blob storage, you can use a function with a blob trigger to read the blob a row at a time and publish the data onto Event Hub:
[FunctionName("ContinuousExport")]
public static async Task Run(
    [BlobTrigger("logs/{name}", Connection = "ContinuousExportBlobSetting")] Stream myBlob, string name,
    [EventHub("logging", Connection = "EventHubConnectionAppSetting")] IAsyncCollector<string> outputEvents,
    TraceWriter log)
{
    log.Info($"C# Blob trigger function processed blob\n Name: {name}\n Size: {myBlob.Length} bytes");
    StreamReader textReader = new StreamReader(myBlob);
    while (!textReader.EndOfStream)
    {
        string line = textReader.ReadLine();
        log.Info(line);
        await outputEvents.AddAsync(line);
    }
}
Note: this is an example, so it will need additional code to make sure that you don’t exceed the Event Hub maximum message size.
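As a sketch of what that additional code might look like, here is one way to group lines into size-bounded batches before sending (plain Python; batch_lines and the 256 KB figure are illustrative assumptions — check the actual limit for your Event Hub tier):

```python
def batch_lines(lines, max_bytes=256 * 1024):
    """Group log lines into batches whose total UTF-8 size stays under max_bytes."""
    batch, size = [], 0
    for line in lines:
        line_size = len(line.encode("utf-8"))
        if batch and size + line_size > max_bytes:
            yield batch          # current batch is full; start a new one
            batch, size = [], 0
        batch.append(line)
        size += line_size
    if batch:
        yield batch              # emit the final partial batch

# Three 100-byte lines with a 250-byte cap -> two batches (2 lines, then 1).
batches = list(batch_lines(["x" * 100] * 3, max_bytes=250))
print([len(b) for b in batches])  # [2, 1]
```

A single line larger than the cap would still need splitting or rejecting; that case is left out of the sketch.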
So with Continuous Export you can extract your log data from Application insights and move it to other systems for processing.
Processing data from IoT Hub in Azure Functions
If you have been following my previous posts (Part 1, part 2, part 3) you will know that I’m using an ESP 8266 to send data to the Azure IoT hub. This post will show you how to receive that data and store it in Azure Storage and also show how you can also forward the data onto the Azure Service Bus.
I’m going to use Visual Studio and C# to write my function. If you are unfamiliar with Azure Functions, you can set up bindings to a variety of Azure resources. These bindings make it easy to interface with resources without needing to write a lot of boilerplate code: your function can be triggered when something happens on a resource, and output bindings let you write data back to resources. For example, there are bindings for Blob and Table storage, Service Bus, timers etc. We’re interested in the IoT Hub binding. The IoT Hub trigger will be fired when an event is sent to the underlying Event Hub. You can also use an output binding to put messages into the IoT Hub event stream. We’re going to use the Table storage and Service Bus output bindings.
To get started you need to create a new Function project in Visual Studio.
Select IoT hub trigger and browse to a storage account you wish to use (for logging) plus add in the setting name you want to use to store the IoT hub connection string.
This will generate your empty function with your preconfigured IoT hub trigger.
You need to add your IoT hub connection string to your settings file. Open local.settings.json and add a new line below the AzureWebJobs settings with the same name you entered in the dialog (ConnectionStringSetting in my example). Your connection string can be found in the Azure Portal.
Navigate to your IoT hub, then click Shared Access Policies
Select the user you want to use to access the IoT hub and click the copy icon next to the primary key connection string.
You can run this in the Visual Studio debugger and when messages are sent to your IoT hub you should see a log appearing in the output window.
What I want to do is to receive the temperature and humidity readings from my ESP 8266 and store the data in Azure storage so that we can process it later.
For that I need to use the Table storage output binding. Add the binding attribute to your function below the FunctionName binding.
[return: Table("MyTable", Connection = "StorageConnectionAppSetting")]
Again, you will need to add the storage setting into your config file. Find your storage account in the Azure portal, click Access keys then copy the key1 connection string and paste it in your config file
To use the Azure Storage output binding you will need to create a class that represents the columns in your table.
I included a device id so that I can identify which device the reading was associated with. You will need to change the return type of your function to TempHumidityIoTTableEntity, then add the code to extract the data from the message.
Firstly, I changed the Python code on my ESP8266 to send the data as JSON so we can process it more easily. I’ve also added a message identifier so that we can send different messages from the ESP8266 and be able to process them differently.
sensor.measure()
dataDict = {'partitionKey': 'r',
            'rowkey': 'recneptiot' + str(utime.ticks_ms()),
            'message': 'temphumidity',
            'temperature': str(sensor.temperature()),
            'humidity': str(sensor.humidity())}
mqtt.publish(sendTopic, ujson.dumps(dataDict), True)
That means we can deserialise the IoT Hub message into something we can easily access. The whole function is below:
[FunctionName("Function1")]
[return: Table("yourtablename", Connection = "StorageConnectionAppSetting")]
public static TempHumidityIoTTableEntity Run(
    [IoTHubTrigger("messages/events", Connection = "ConnectionStringSetting")] EventData message,
    TraceWriter log)
{
    var messageAsJson = Encoding.UTF8.GetString(message.GetBytes());
    log.Info($"C# IoT Hub trigger function processed a message: {messageAsJson}");

    var data = JsonConvert.DeserializeObject<Dictionary<string, string>>(messageAsJson);
    var deviceid = message.SystemProperties["iothub-connection-device-id"];

    return new TempHumidityIoTTableEntity
    {
        PartitionKey = deviceid.ToString(),
        RowKey = $"{deviceid}{message.EnqueuedTimeUtc.Ticks}",
        DeviceId = deviceid.ToString(),
        Humidity = data.ContainsKey("humidity") ? data["humidity"] : "",
        Temperature = data.ContainsKey("temperature") ? data["temperature"] : "",
        DateMeasured = message.EnqueuedTimeUtc.ToString("O")
    };
}
Providing your config is correct you should be able to run this in the Visual Studio debugger and view your data in Table Storage:
I mentioned at the start that I wanted to pass some messages onto the Azure Service Bus. For example, we may want to do something if the humidity goes above 60 percent. In this example we could add a HighHumidity message to Service Bus for some other service or function to respond to. We’ll send the message as a JSON string so that we can action it later in a different service. You can easily add a Service Bus output binding to your function. However, the binding documentation shows it as another return value. There is an alternative binding that allows you to set a message string out parameter with the message contents. This can be used as follows:
[FunctionName("Function1")]
[return: Table("yourtablename", Connection = "StorageConnectionAppSetting")]
public static TempHumidityIoTTableEntity Run(
    [IoTHubTrigger("messages/events", Connection = "ConnectionStringSetting")] EventData message,
    [ServiceBus("yourQueueOrTopicName", Connection = "ServiceBusConnectionSetting", EntityType = EntityType.Topic)] out string queueMessage,
    TraceWriter log)
{
    var messageAsJson = Encoding.UTF8.GetString(message.GetBytes());
    log.Info($"C# IoT Hub trigger function processed a message: {messageAsJson}");

    var data = JsonConvert.DeserializeObject<Dictionary<string, string>>(messageAsJson);
    var deviceid = message.SystemProperties["iothub-connection-device-id"];

    queueMessage = null;
    if (data.ContainsKey("humidity"))
    {
        int humidity = int.Parse(data["humidity"]);
        if (humidity > 60)
        {
            Dictionary<string, string> overHumidityThresholdMessage = new Dictionary<string, string>
            {
                { "deviceId", deviceid.ToString() },
                { "humidity", humidity.ToString() },
                { "message", "HighHumidityThreshold" }
            };
            queueMessage = JsonConvert.SerializeObject(overHumidityThresholdMessage);
        }
    }

    return new TempHumidityIoTTableEntity
    {
        PartitionKey = deviceid.ToString(),
        RowKey = $"{deviceid}{message.EnqueuedTimeUtc.Ticks}",
        DeviceId = deviceid.ToString(),
        Humidity = data.ContainsKey("humidity") ? data["humidity"] : "",
        Temperature = data.ContainsKey("temperature") ? data["temperature"] : "",
        DateMeasured = message.EnqueuedTimeUtc.ToString("O")
    };
}
We now have a function that reads the device temperature and humidity readings into Table storage and then sends a message to a Service Bus topic if the humidity goes above a threshold value.
Generating your IoT Hub Shared Access Signature for your ESP 8266 using Azure Functions
In my last two posts I showed how you can connect your ESP 8266 to the IoT hub to receive messages from the hub and also to send messages. One of the issues I had was generating the Shared Access Signature (SAS) which is required to connect to the IoT hub. I was unable to generate this on the device, so I decided to use Azure Functions. The code required is straightforward and can be found here: https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-security#security-tokens
To create an Azure Function, go to the Azure management portal click the menu icon in the top left and select “Create a Resource”
Search for “Function”
and select “Function App” and click Create
Complete the form
And click Review and Create to accept the defaults or click next and work through the wizard if you want to change from the default values.
Click Create to kick off the deployment of your new Azure Function. Once the deployment is complete, navigate to the Function by clicking “Go To Resource”. You now need to create your function.
Click the + sign next to “Functions”. I used the in-portal editor as it was the easiest option, since I already had most of the code copied from the site mentioned above.
Click In-Portal, then Continue and choose the Webhook + API template and click Create
Your function is now ready for editing. It will have some default code in there to give you an idea how to start
We’re going to use the previous SAS code in here and modify it to accept a json payload with the parameters you need for the SAS to be created.
The json we’ll use is as follows:
{
  "resourceUri": "[Your IoT Hub Name].azure-devices.net/devices/[Your DeviceId]",
  "expiryInSeconds": 86400,
  "key": "[SAS Key from IoT hub]"
}
You can get your SAS key from the IoT hub in the Azure Portal in the devices section. Click on the device.
Then copy the Primary or Secondary key.
Back in the function editor, paste the following code:
C# function
#r "Newtonsoft.Json"

using System;
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;
using System.Globalization;
using System.Net.Http;
using System.Security.Cryptography;
using System.Text;

public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");
    string token = "";
    try
    {
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        dynamic data = JsonConvert.DeserializeObject(requestBody);
        int expiryInSeconds = (int)data?.expiryInSeconds;
        string resourceUri = data?.resourceUri;
        string key = data?.key;
        string policyName = data?.policyName;

        TimeSpan fromEpochStart = DateTime.UtcNow - new DateTime(1970, 1, 1);
        string expiry = Convert.ToString((int)fromEpochStart.TotalSeconds + expiryInSeconds);

        string stringToSign = WebUtility.UrlEncode(resourceUri) + "\n" + expiry;
        HMACSHA256 hmac = new HMACSHA256(Convert.FromBase64String(key));
        string signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

        token = String.Format(CultureInfo.InvariantCulture,
            "SharedAccessSignature sr={0}&sig={1}&se={2}",
            WebUtility.UrlEncode(resourceUri), WebUtility.UrlEncode(signature), expiry);
        if (!String.IsNullOrEmpty(policyName))
        {
            token += "&skn=" + policyName;
        }
    }
    catch (Exception ex)
    {
        // Return a 400 rather than a 200 when the payload is invalid.
        return new BadRequestObjectResult(ex.Message);
    }
    return new OkObjectResult(token);
}
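For comparison, the same token algorithm translated to plain CPython (a sketch mirroring the C# above; generate_sas_token and the dummy key are my own names, and this is for desktop testing rather than for running on the ESP 8266):

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri, key, policy_name=None, expiry_in_seconds=86400):
    """Build an IoT Hub Shared Access Signature, mirroring the C# function above."""
    expiry = str(int(time.time()) + expiry_in_seconds)
    # Sign the URL-encoded resource URI plus the expiry, separated by a newline.
    string_to_sign = urllib.parse.quote_plus(resource_uri) + "\n" + expiry
    signed = hmac.new(base64.b64decode(key),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    signature = urllib.parse.quote_plus(base64.b64encode(signed).decode())
    token = "SharedAccessSignature sr={}&sig={}&se={}".format(
        urllib.parse.quote_plus(resource_uri), signature, expiry)
    if policy_name:
        token += "&skn=" + policy_name
    return token

# Demo with a made-up base64 key; a real key comes from the IoT hub device page.
print(generate_sas_token("myhub.azure-devices.net/devices/esp8266",
                         base64.b64encode(b"dummy-key").decode()))
```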
Click Save and Run and make sure that there are no compilation errors. To use the function you need to post the json payload to the following address:
https://[your Function Name].azurewebsites.net/api/HttpTrigger1?code=[your function access key]
To retrieve your function access key, click Manage and copy your key from the Function Keys section
We’re now ready to use this in micropython on your ESP 8266. I created a function to retrieve the SAS
def getsas(hubname, deviceid, key):
    import urequests
    import ujson
    dict = {}
    dict["resourceUri"] = hubname + '.azure-devices.net/devices/' + deviceid
    dict["key"] = key
    dict["expiryInSeconds"] = 86400
    payload = ujson.dumps(dict)
    response = urequests.post('https://[your function name].azurewebsites.net/api/HttpTrigger1?code=[your function access key]', data=payload)
    return response.text
In my connectMQTT() function from the first post I replaced the hard coded SAS string with a call to the getsas function. The function returns a SAS which is valid for 24 hours so you will need to retrieve a new SAS once 24 hours has elapsed.
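Since the token is only valid for 24 hours, one option is a small wrapper that remembers when the token will expire and only calls getsas again when it is close to expiring. A sketch in plain Python (SasCache, the margin and the fake clock are illustrative assumptions):

```python
import time

class SasCache:
    """Cache a SAS token and refresh it shortly before it expires."""

    def __init__(self, fetch_token, lifetime_seconds=86400, margin_seconds=300):
        self.fetch_token = fetch_token  # e.g. lambda: getsas(hub, device, key)
        self.lifetime = lifetime_seconds
        self.margin = margin_seconds
        self.token = None
        self.expires_at = 0

    def get(self, now=None):
        now = time.time() if now is None else now
        # Refresh if we have no token yet or are within the safety margin.
        if self.token is None or now >= self.expires_at - self.margin:
            self.token = self.fetch_token()
            self.expires_at = now + self.lifetime
        return self.token

# Demo with a fake fetcher and a short lifetime so the refresh is visible.
calls = []
cache = SasCache(lambda: calls.append(1) or "token-%d" % len(calls),
                 lifetime_seconds=100, margin_seconds=10)
print(cache.get(now=0))    # token-1 (fetched)
print(cache.get(now=50))   # token-1 (still cached)
print(cache.get(now=95))   # token-2 (refreshed near expiry)
```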
I can now run my ESP 8266 code without modifying it to give it a new SAS each time I want to use it. I always forgot to do that and wondered why it never worked the next time I used it. I can now both send and receive data from/to the ESP 8266 and also generate a SAS to access the IoT hub. The next step is to use the data received by the hub in an application and send action messages back to the ESP 8266 if changes are made. I look forward to letting you know how I got on with that in a future post.
Sending data from the ESP 8266 to the Azure IoT hub using MQTT and MicroPython
In my previous post I showed you how to connect your ESP 8266 to the Azure IoT hub and receive messages from the IoT hub to turn on an LED. In this post I'll show you how to send data to the IoT hub. For this I need a sensor that I will read at regular intervals, then send the data back to the IoT hub. I picked a temperature and humidity sensor I had from the kit of sensors I bought.
This sensor is compatible with the DHT MicroPython library. In order to connect to the IoT hub, use the same connection code from my previous post. The difference with sending is that you need an endpoint for MQTT to send your temperature and humidity data to. The topic to send to is as follows:
devices/<your deviceId>/messages/events/
So using the same device id as in the last post then my send topic would be devices/esp8266/messages/events/
To send a message to the IoT hub, use the publish method. This needs the topic plus the message you want to send. I concatenated the temperature and humidity and separated them with a comma for simplicity.
import dht
import time

sensor = dht.DHT11(machine.Pin(16))
mqtt = connectMQTT()
sendTopic = 'devices/<your deviceId>/messages/events/'
while True:
    sensor.measure()
    mqtt.publish(sendTopic, str(sensor.temperature()) + ',' + str(sensor.humidity()), True)
    time.sleep(1)
The code above is all that is required to read the sensor every second and send the data to the IoT hub.
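On the receiving side, that comma-separated payload splits back out easily (illustrative plain Python; parse_reading is a made-up helper, not part of the device code):

```python
def parse_reading(payload):
    """Split the 'temperature,humidity' payload sent by the device."""
    temperature, humidity = payload.split(",")
    return float(temperature), float(humidity)

print(parse_reading("21.5,48"))  # (21.5, 48.0)
```

This is one reason the later posts switch to a JSON payload: named fields are less fragile than positional ones.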
In Visual Studio Code with the Azure IoT Hub Toolkit extension installed, you can monitor the messages that are sent to your IoT hub. In the devices view, right click on the device that has sent the data and select “Start Monitoring Built-in Event Endpoint”
This then displays the messages that are received by your IoT hub in the output window
You can see in the body of the received message the temperature and humidity values that were sent.
I still need to sort out generating the Shared Access Signature and also programmatically access the data I send to the IoT hub. I hope to have blog posts for these soon.
Connecting the ESP 8266 to Azure IoT Hub using MQTT and MicroPython
Recently I was introduced to the ESP 8266 processor, which is a low cost IoT device with built-in Wi-Fi, costing around £3 - £4 for a development board. The thing that interested me (apart from the price) was that the device is Arduino compatible and will also run MicroPython. The version I purchased from Amazon was the NodeMcu variant with built-in power and serial port via a microUSB port, so it makes an ideal board to start with as there are no additional components required.
This board, however, did not have MicroPython installed, and that required a firmware change. The instructions were fairly straightforward and I followed this tutorial.
After installing MicroPython you can connect to the device using a terminal emulator via the USB serial port. Check Device Manager to find the COM port number; the default baud rate is 115200. I used the Arduino Serial Monitor tool. In the terminal emulator you can press Enter and you should get back the Python REPL prompt. If not, then you have the COM port or baud rate wrong.
You can write your Python directly in here, but it's easier to write the Python on your PC and then run it on the device. For this I use ampy.
In Command Prompt install ampy using:
pip install adafruit-ampy
This allows you to connect to your device. Close the terminal emulator to free up the COM port then type the following to list the files on your device:
ampy --port COM4 --baud 115200 ls
The MicroPython Quick Ref summarises how to access the GPIO ports etc., but in order to connect to the IoT hub you will need to configure the Wi-Fi on the device. This can be done using the network module.
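A minimal sketch of that Wi-Fi setup might look like the following. This is an illustration rather than the post's exact script: the SSID and password are placeholders, and the code runs on the device itself, not on your PC.

```python
# MicroPython sketch: connect the ESP8266 to Wi-Fi using the network module.
# SSID and password are placeholders. Print statements help debugging via ampy.
def connect_wifi(ssid, password, timeout_s=15):
    import network  # MicroPython module, available on the device
    import time
    wlan = network.WLAN(network.STA_IF)  # station (client) interface
    wlan.active(True)
    if not wlan.isconnected():
        print('connecting to', ssid)
        wlan.connect(ssid, password)
        deadline = time.time() + timeout_s
        while not wlan.isconnected():
            if time.time() > deadline:
                raise OSError('Wi-Fi connection timed out')
            time.sleep(0.5)
    print('network config:', wlan.ifconfig())
```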
So create a new text file on your PC and write the code to connect to your Wi-Fi. To test this, use ampy to run the Python on the device:
ampy --port COM4 --baud 115200 run networking.py
It's a good idea to use print statements to help debug, as once the run has completed the output will be reflected back in your Command Prompt.
Now you are connected to Wi-Fi, we can start to look at connecting to the IoT hub. I am assuming that you already have your IoT hub set up. We now need to configure your new device. Navigate to the IoT hub in the Azure Portal. Under Explorers click IoT Devices, then New.
Enter your device id, the name your device will be known as. All your devices need a name that is unique to your IoT hub. Then click Save. This will auto-generate the keys needed to create the shared access signature used to access the IoT hub later.
Once created, you may need to click refresh in the devices list to see your new device. Click the device and copy the primary key; you will need this later to generate the Shared Access Signature used in the connection string. In order to generate a new Shared Access Token you can use Visual Studio Code with the Azure IoT Hub Toolkit extension installed. This puts a list of devices and endpoints in the explorer view and allows you to create a new Shared Access Token. Find your device in the Devices list, right-click and select Generate SAS Token For Device.
You will be prompted to enter the number of hours the token is valid for and the new SAS token will appear in the output window:
SharedAccessSignature sr=[your iothub name].azure-devices.net%2Fdevices%2Fesp8266&sig=bSpX6UMM5hdUKXHfTagZF7cNKDwKnp7I3Oi9LWTZpXI%3D&se=1574590568
The shared access signature is made up of the full address of your device, a time stamp indicating how long the signature is valid for, and a signature over the whole thing. You can take this and use it to test your access to IoT hub, so make sure you make the time long enough to allow you to test. The ESP8266 doesn't have a clock that can be used to generate the correct time, so you will need to create the SAS off board. I'm going to use an Azure function with the code here to generate it.
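The post defers SAS generation to an Azure Function, but as an illustration the same algorithm can be sketched in standard Python. It follows the documented IoT Hub format: sign `<url-encoded resource uri>\n<expiry>` with the base64-decoded device key. The hub name and device key below are made-up placeholders.

```python
# Sketch of off-board SAS token generation for IoT Hub in standard Python.
# The resource URI and device key below are placeholders, not real values.
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri, device_key, hours_valid=24):
    expiry = int(time.time()) + hours_valid * 3600
    encoded_uri = urllib.parse.quote(resource_uri, safe='')
    # Sign "<url-encoded uri>\n<expiry>" with the base64-decoded device key
    to_sign = ('%s\n%d' % (encoded_uri, expiry)).encode('utf-8')
    signature = hmac.new(base64.b64decode(device_key), to_sign, hashlib.sha256).digest()
    encoded_sig = urllib.parse.quote(base64.b64encode(signature).decode(), safe='')
    return 'SharedAccessSignature sr=%s&sig=%s&se=%d' % (encoded_uri, encoded_sig, expiry)

# Example with a made-up key (a real device primary key is already base64):
token = generate_sas_token('myiothub.azure-devices.net/devices/esp8266',
                           base64.b64encode(b'not-a-real-key').decode())
```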
Back to Python now. In order to connect to the IoT hub you will need to use the MQTT protocol. MicroPython uses umqtt.simple.
There are a few things required before you can connect.
Firstly the Shared Access Signature that you created above.
Next you will need to get the DigiCert Baltimore Root certificate that IoT Hub uses for SSL. This can be found here. Copy the text from -----BEGIN CERTIFICATE----- to -----END CERTIFICATE-----, including both the Begin and End lines. Remove the quotes and replace the \r\n with real new lines in your text editor, then save the file as something like baltimore.cer.
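If you prefer to script that cleanup, a small helper (an illustration, not part of the original post) can strip the quotes and turn the literal \r\n sequences into real new lines:

```python
# Illustrative helper: turn the quoted single-line certificate string into a
# PEM file with real new lines. The certificate body below is truncated.
def write_cert(cert_string, path='baltimore.cer'):
    pem = cert_string.strip('"').replace('\\r\\n', '\n')
    with open(path, 'w') as f:
        f.write(pem)
    return pem

pem = write_cert('"-----BEGIN CERTIFICATE-----\\r\\n'
                 'MIIDdzCCAk+gAwIBAgIEAgAAuTANBg\\r\\n'
                 '-----END CERTIFICATE-----\\r\\n"')
```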
Next you will need a ClientId. For IoT hub the ClientId is the name of your device in IoT Hub. In this example it is esp8266
Next you will need a Username. For IoT hub, this is the full hostname of your IoT Hub with your client id and an API version, e.g. [your iothub name].azure-devices.net/esp8266/?api-version=2018-06-30
The following code should allow you to connect to the IoT Hub:
def connectMQTT():
    from umqtt.simple import MQTTClient
    CERT_PATH = "baltimore.cer"
    print('getting cert')
    with open(CERT_PATH, 'r') as f:
        cert = f.read()
    print('got cert')
    sslparams = {'cert': cert}
    CLIENT_ID = 'esp8266'
    Username = 'yourIotHub.azure-devices.net/esp8266/?api-version=2018-06-30'
    Password = 'SharedAccessSignature sr=yourIotHub.azure-devices.net%2Fdevices%2Fesp8266&sig=bSpX6UMM5hdUKXHfTagZF7cNKDwKnp7I3Oi9LWTZpXI%3D&se=1574590568'
    mqtt = MQTTClient(client_id=CLIENT_ID, server='yourIotHub.azure-devices.net', port=8883,
                      user=Username, password=Password, keepalive=4000,
                      ssl=True, ssl_params=sslparams)
    mqtt.set_callback(lightLed)
    mqtt.connect(False)
    mqtt.subscribe('devices/esp8266/messages/devicebound/#')
    flashled(4, 0.1, blueled)
    return mqtt
set_callback requires a function which will be called when there is a device message sent from the IoT Hub. Mine just turns an LED on or off:
def lightLed(topic, msg):
    if msg == b'on':
        statusled.on()
    else:
        statusled.off()
connect(False) means that the topic this device subscribes to will persist after the device disconnects.
I’ve also configured the device to connect to its bound topics so that any message sent to the device will call the callback function.
Now we need a process loop so that we can receive the messages. The ESP8266 does not seem to run async code, so we need to call the wait_msg function to get any message back from the IoT hub:
mqtt = connectMQTT()
print('connected...')
while True:
    mqtt.wait_msg()
Save your Python as networking.py (and make sure that all the code you wrote initially to connect to Wi-Fi is included), then run ampy again:
ampy --port COM4 --baud 115200 run networking.py
Your device should now run. I've used LED flashes to show me progress: connecting to Wi-Fi, then connecting to IoT Hub, and through to receiving a message. There is a blue LED on the board which I've been using, as well as a standard LED which is turned on/off based upon the device message received from the IoT Hub. The blue LED is GPIO 2.
In order to send a message from the IoT hub to your device, you can do this from the Azure Portal in the devices view. Click on the device, then click Message To Device.
Enter the Message Body (on or off) and click Send Message
Alternatively you can do this in Visual Studio Code by right clicking the device and selecting Send C2D Message To Device and enter the message in the box that pops up
In my example the LED lights when I enter on and turns off when I enter off. ampy is likely to time out during this process, but that's OK: as we've put the message retrieval inside a loop, the board will continue to run. To stop it running you will need to press the reset button.
My next step is to sort out automatically generating the Shared Access Signature and then I’ll look at sending data to the IoT Hub
Migrating Azure Scheduled Web Jobs to Logic Apps
If you have scheduler jobs running in Azure you may have received an email recently stating that the Scheduler is being retired and that you need to move your schedules off it by 31st December 2019 at the latest; you will also not be able to view your schedules via the portal after 31st October.
This is all documented in the following post: https://azure.microsoft.com/en-us/updates/extending-retirement-date-of-scheduler/
There is an alternative to the Scheduler, and that is Logic Apps; there is a link on that page showing you how to migrate.
I’m currently using the scheduler to run my webjobs on various schedules, from daily and weekly to monthly. Webjobs are triggered by using an HTTP POST request, and I showed how to set this up using the scheduler in a previous post:
Creating a Scheduled Web Job in Azure
I will build on that post and show how you can achieve the same thing using Logic Apps. You will need the following information in order to configure the Logic App: Webhook URL, Username, Password
You can find these in the app service that is running your webjob. Click “Webjobs”, select the job you are interested in, then click “Properties”. This will display the properties panel where you can retrieve all these values.
Now you need to create a Logic App. In the Azure Portal dashboard screen click “Create a Resource” and enter Logic App in the search box, then click “Create”
Complete the form and hit Create
Once the resource has been created you can then start to build your schedule. Opening the Logic App for the first time should take you to the Logic App Designer. Logic Apps require a trigger to start them running, and there are lots of different triggers, but the one we are interested in is the Recurrence trigger.
Click “Recurrence” and this will be added to the Logic App designer surface for you to configure
I want my schedule to run at 3am every day, so I select a frequency of Day and an interval of 1, then click “Add New Parameter”.
Select “At these hours” and “At these minutes”. Two edit boxes appear; add 3 in the hours box and 0 in the minutes box. You have now set up the schedule. We now need to configure the Logic App to trigger the web service. As discussed above, we can use a web hook.
All we have in the Logic App is a trigger that starts the Logic App at 3am UTC, we now need to add an Action step that starts the web job running.
Below the Recurrence box there is a box called “+ New Step”, click this and then search for “HTTP”
Select the top HTTP option
Select POST as the method and Basic as Authentication, then enter your url, username and password
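For reference, the HTTP action is doing nothing more exotic than a POST with a Basic authentication header. A hypothetical Python sketch of the same call follows; the URL, username and password are placeholders standing in for the values from the webjob's properties panel.

```python
# Build (but don't send) the same POST request the Logic App HTTP action makes.
# The URL, username and password are placeholders; take the real values from
# the webjob's Properties panel.
import base64
import urllib.request

def build_webjob_request(webhook_url, username, password):
    credentials = base64.b64encode(('%s:%s' % (username, password)).encode()).decode()
    request = urllib.request.Request(webhook_url, data=b'', method='POST')
    request.add_header('Authorization', 'Basic ' + credentials)
    return request  # send with urllib.request.urlopen(request)

request = build_webjob_request(
    'https://myapp.scm.azurewebsites.net/api/triggeredwebjobs/MyJob/run',
    '$myapp', 'publish-profile-password')
```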
The web job is now configured and the Logic App can be saved by clicking the Save button. If you want to rename each of the steps so you can easily see what you have configured then click “…” and select “Rename”
You can test the Logic App is configured correctly by triggering it to run. This will ignore the schedule and run the HTTP action immediately
If the request was successful then you should see ticks appear on the two actions or if there are errors you will see a red cross and be able to see the error message
If the web job successfully ran then open the web job portal via the app services section to see if your web job has started.
If you want to trigger a number of different web jobs on the same schedule then you can add more HTTP actions below the one you have just set up. If you want to delay running a job for a short while you can add a Delay task.
If you want to run on a weekly or monthly schedule then you will need to create a new Logic App with a Recurrence configured to the schedule you want and then add the HTTP actions as required.
The scheduler trigger on the Logic App will be enabled as soon as you click Save. To stop it triggering you can Disable the Logic App on the Overview screen once you exit the Designer
Hopefully this has given you an insight into how to get started with Logic Apps. Take a look at the different triggers and actions and see that you can do a lot more than just scheduling web jobs.
Adding Application Insights Logging to your code
This is the fourth in a series about Application Insights and Log Analytics. I’ve shown you how to add existing logs, how to use the Log Analytics query language to view your logs, and how to enhance your query to drill down and get to the logs you are interested in. This post is about how you can add logs from your code and provide the information to allow you to refine your queries and help you diagnose your faults more easily.
If you don’t already have application insights then you can create a new instance in the Azure portal (https://portal.azure.com/)
Get your application insights key from the azure portal. Click on your application insights instance and navigate to the Overview section then copy your instrumentation key. You will need this in your code.
In your project, add Application Insights via NuGet:
Install-Package Microsoft.ApplicationInsights -Version 2.10.0
In your code you need to assign the key to Application Insights as follows:
TelemetryConfiguration configuration = TelemetryConfiguration.CreateDefault();
configuration.InstrumentationKey = "put your key here";
To log details using application insights then you need a telemetry client.
TelemetryClient telemetry = new TelemetryClient(configuration);
The telemetry client has a large number of features that I am not going to talk about here, as I am just interested in logging today. There are three methods of interest: TrackEvent, TrackException and TrackTrace.
I use TrackEvent to log things like the start and end of methods, or if something specific occurs that I want to log; TrackException is for logging exception details; and TrackTrace is for everything else.
telemetry.TrackEvent("Some Important Work Started");
try
{
    telemetry.TrackTrace("I'm logging out the details of the work that is being done", SeverityLevel.Information);
}
catch (Exception ex)
{
    telemetry.TrackException(ex);
}
telemetry.TrackEvent("Some Important Work Completed");
You now have the basics for logging. This will be useful to some extent, but it will be difficult to follow the traces when you have a busy system with lots of concurrent calls to the same methods. To assist you in filtering your logs, it is useful to provide some identifying information that you can add to your logs to allow you to track and trace calls through your system. You could add these directly to your log messages, but that makes your logs bloated and difficult to read. Instead, Application Insights provides a mechanism to pass properties along with the logs, which will appear in the Log Analytics data that is returned from your query.
Along with each log you can pass a dictionary of properties. I add to the set of properties as the code progresses, to provide identifying information to assist with filtering the logs; I generally add in each new identifier as it is created. I can then use these in my queries to track the calls through my system and remove the ones I am not interested in. Diagnosing faults then becomes a lot easier.
To make this work, you need to be consistent with the naming of the properties so that you always use the same name for the same property in different parts of the system. Also try to be consistent about when you use TrackEvent and TrackTrace. You can set levels for your traces based upon the severity level (Verbose, Information, Warning, Error, Critical).
TelemetryConfiguration.Active.InstrumentationKey = Key;
TelemetryClient telemetry = new TelemetryClient();
var logProperties = new Dictionary<string, string>();
logProperties.Add("CustomerID", "the customer id passed through from elsewhere");
telemetry.TrackEvent("Some Important Work Started", logProperties);
try
{
    var orderId = GenerateOrder();
    logProperties.Add("OrderID", orderId.ToString());
    telemetry.TrackTrace("I just created an order", logProperties);
    var invoiceId = GenerateInvoice();
    logProperties.Add("InvoiceID", invoiceId.ToString());
    telemetry.TrackTrace("I've just created an invoice", logProperties);
    SendInvoice(invoiceId);
}
catch (Exception ex)
{
    telemetry.TrackException(ex, logProperties);
}
telemetry.TrackEvent("Some Important Work Completed", logProperties);
telemetry.Flush();
Flush needs to be called at the end to ensure that the data is sent to Log Analytics. In the code above you can see that I’ve added a CustomerID, OrderID and InvoiceID to the log properties and passed the log properties to each of the telemetry calls. Each of the logs will contain the properties that were set at the time of logging. I generally wrap all this code so that I do not have to pass the log properties into each call; I can add to the log properties whenever I have new properties, and then each of the telemetry calls will include them.
When we look at the logs via Log Analytics we can see the additional properties on the logs and then use them in our queries.
The log properties appear in customDimensions and you can see how the invoice log has the invoice id as well as the customer id and order id. The order log only has the customer id and order id.
You can add the custom dimensions to your queries as follows:
union traces, customEvents, exceptions
| order by timestamp asc
| where customDimensions.CustomerID == "e56e4baa-9e1d-4c3c-b498-365bf2807a5f"
You can also see the severity level in the logs, which allows you to filter your logs to a sensible level. You need to plan your logs carefully and set an appropriate level to stop you flooding your logs with unnecessary data until you need it.
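As a sketch, the severity levels map to numbers in the logs (Verbose=0 up to Critical=4), so a query can drop the noise below Warning like this:

```
union traces, exceptions
| where severityLevel >= 2
| order by timestamp asc
```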
I’ve now shown you how to add logs to your application. You can find out more about the other methods available on the telemetry API here.