Processing real-time video stream using Nvidia Jetson Nano and AI model on the Edge



Hello everyone, and welcome to the Cloud Lunch and Learn sessions. My name is Hugo Barona, and today we have Victor with us, talking about IoT. So I hand over to you, Victor.

Thank you, Hugo. Hello everybody, I hope you are having an amazing beginning of the week. My name is Victor Rodrigues and I am an Azure Cloud Specialist for data modernization, IoT, machine learning and AI at Storm Technology. I have been enjoying the amazing world of IoT for the last three years, working on a couple of projects related to vehicle onboard units for fleet management and smart offices, so feel free to connect with me on LinkedIn and Twitter.

So let's begin with the agenda for today's presentation. I would like to start with an introduction to Azure IoT Edge and how you can use it to build resilient IoT solutions. We will then look into the NVIDIA Jetson Nano Developer Kit, which lets you run AI workloads on a small form factor GPU produced by NVIDIA, and take a look at the NVIDIA DeepStream SDK. Then we will deploy an Azure Custom Vision AI module using Azure IoT Edge, running on the NVIDIA Jetson Nano, and finally conclude with a slide of additional resources on how to get started with the Jetson Nano Developer Kit, Azure IoT Edge and IoT Hub.

So let's talk about Azure IoT Edge. It is essentially a runtime that can be deployed to a microcomputer and allows you to deploy your IoT workloads as containerized modules. There is first-class support for device SDKs in Python, Node.js, Java and C, and you can produce telemetry using AMQP or MQTT as the transport, which gives you fast transmission from the device to the cloud. You can also operate in offline or intermittent network conditions. That means that if you have one of these devices deployed, let's say, in a shipping container that loses internet connectivity at sea, whenever connectivity comes back you can send the cached telemetry up to the cloud.

The runtime supports Linux on x64 as well as ARM32, in addition to Windows. The entire application is open source and available on GitHub. It consists of the edgeAgent, which is responsible for deployment, pulling down the containers and getting the specification from the cloud on which modules should run, and the edgeHub, which handles the communication from the device to the cloud as well as inter-module communication, as you will see in the demo today.

Many popular Azure services can be deployed to the edge as modules. For example, if you would like to perform stream analytics in real time, maybe to aggregate your data over a specific time window, you can create that service on Azure and export it down to the IoT Edge device. Similarly, our serverless option on Azure, Azure Functions, lets you create very simple microservices that can be pulled down to the IoT Edge device and run as modules.

And if you want to go into more advanced scenarios, IoT Edge has support for AI and machine learning modules. These come from the popular Cognitive Services offerings like Text Analytics, Face Detection and the Computer Vision services, such as pulling text out of images. You can also create Custom Vision models trained to detect anything you like, for example soda cans that fall down on a production line, and deploy them as an IoT Edge module, which is exactly what we are going to see today.

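That offline, store-and-forward behaviour is something edgeHub gives you out of the box, but the idea is easy to sketch. Here is a minimal Python illustration; the class and method names are my own invention for the example, not the real IoT Edge API:

```python
from collections import deque

class StoreAndForwardSender:
    """Caches telemetry while offline, flushes the backlog on reconnect.
    Illustrative only: edgeHub does this for you in a real deployment."""

    def __init__(self, transport):
        self.transport = transport          # callable that sends one message
        self.buffer = deque()               # cached messages while offline
        self.online = False

    def send(self, message):
        if self.online:
            self.transport(message)
        else:
            self.buffer.append(message)     # cache until we reconnect

    def set_online(self, online):
        self.online = online
        while self.online and self.buffer:  # flush the backlog in order
            self.transport(self.buffer.popleft())

sent = []
sender = StoreAndForwardSender(transport=sent.append)
sender.send({"temp": 21})                   # offline: cached, not sent
sender.set_online(True)                     # reconnect: backlog is flushed
sender.send({"temp": 22})                   # online: sent immediately
print(sent)                                 # [{'temp': 21}, {'temp': 22}]
```

The important property, which edgeHub preserves as well, is that messages keep their original order when the backlog drains.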
In even more advanced scenarios, IoT Edge also supports custom machine learning projects that you can bundle up as IoT Edge modules.

It is important to recap what we have talked about so far. If we are interested in building an intelligent video application, we can do that: as we discussed, certain things can be handled by Cognitive Services, and we have the IoT Edge runtime, backed securely by Azure IoT Hub, which gives us high-throughput messaging. So we have a great ability to go ahead and get started. Unfortunately, the problem is that if you don't have some kind of acceleration layer, you won't see good performance for this type of workload, especially when we talk about custom vision or AI models deployed on a CPU-only device at the edge.

And this is where the NVIDIA Jetson line becomes really interesting for this kind of workload. It is a small form factor ARM-based microcomputer that comes equipped with an onboard GPU to accelerate AI and machine learning, and it comes in different levels; the one we talk about today is the entry level. The Jetson Nano costs around 99 dollars, and that gives you 128 CUDA-capable cores on a device that is not much larger than a cell phone. As you can see, the specs for this device are really impressive given this low price point: a 64-bit quad-core ARM Cortex-A57 running at 1.43 GHz, and, what is great too, the board comes with 4 GB of 64-bit memory. The interfaces available are also great: there is a CSI connector to attach cameras if you want to do video processing, as well as a gigabit Ethernet port, among other things. It is a very good way to get started at a very low price point.

This brings us to the next point. If you are building a detector and you are going to deploy it to your Jetson device, the DeepStream SDK that is available on the device lets you very easily create a pipeline that takes video inputs and applies any TensorRT workload you might have on top of them, to do things like detecting objects that appear in the video stream. So perhaps you have an existing model built with TensorFlow or Caffe: as long as you export it to one of the formats compatible with TensorRT, which include TF-TRT, UFF, ONNX and the Caffe model format, you are good to go, and your model is immediately available to run inside DeepStream applications to perform detection on the video inputs of the device.

When you look at this along with the previous slide, you can see that you can perform object detection on eight 720p video streams on just one of these devices. Imagine: this is a board smaller than a cell phone, and you can run eight continuous video streams on it. You are going to see a little bit of this in the demo today.

The Jetson ships with JetPack, which includes an OS image, libraries, APIs, samples, developer tools and documentation, along with a configured Linux kernel and the NVIDIA drivers. It is an Ubuntu-based OS image that comes pre-populated with, for example, CUDA; TensorRT, the SDK for high-performance deep learning inference; the Multimedia API for video encoding and decoding; OpenCV, the popular library for computer vision, image processing and machine learning; and the Vision Programming Interface (VPI) library. That saves you the trouble of installing all of these yourself and gets you up and running much faster.

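To give you a feel for how an exported model plugs into DeepStream: you point the nvinfer inference element at it through a config file. This is a hedged sketch of the relevant section, with placeholder file names; the key names are from memory of the nvinfer plugin config format, so double-check them against the DeepStream documentation before relying on them:

```ini
[property]
# placeholder paths: substitute your exported model and labels
onnx-file=custom_vision.onnx
labelfile-path=labels.txt
batch-size=1
# 0=FP32, 1=INT8, 2=FP16; FP16 suits the Nano's GPU
network-mode=2
# two classes in our demo: "up" and "down"
num-detected-classes=2
```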
So when you are building an AI or video-processing machine learning model, especially at the development or POC stage, you want to move fast at the beginning of the development and deployment, and JetPack helps with that.

Maybe you are wondering how we run all this workload inside a container and how we get GPU support into the container. The way that occurs is that, underneath the hood, NVIDIA developed something called nvidia-docker. It is a special form of Docker container runtime that gives the container special access to the host: it mounts the CUDA drivers needed by the GPU, as well as the host hardware, into the container. It acts as the interface between the Docker engine, the CUDA driver and the NVIDIA GPUs, and by doing this it lets you leverage the GPU capabilities inside the container. That is the reason it is so powerful, and you can keep using the regular Docker engine in this setup.

So, for the demo today, the use case is a soda can manufacturer who wants to improve the efficiency of its plant by detecting soda cans that fall down on the production line. We will simulate cameras monitoring each of the lines, collect images, train a classification model with Custom Vision, which is a no-code computer vision model builder, to detect whether cans are up or down, and then deploy this Custom Vision model through DeepStream. So let's get started with the configuration; let me switch screens.

OK, so before we check the Custom Vision model and the AI part, the interesting part, let's first double-check the IoT Edge side. Here in my Azure portal, in my resource group, I will open the IoT Hub; under Automatic Device Management I have IoT Edge. As you can see, I already have one device here from another presentation, so now I am going to create a new device. To create it, I just click "Add an IoT Edge device". For the device ID I set the name jetson-nano-001, and I leave everything at the defaults, using symmetric key authentication; since this is just a development environment, that is OK for this. We are going to use manual provisioning here, but I will talk about that later. I leave everything at the defaults and just click Save, and let's wait. OK, it's done, so let's open the device.

Just one thing: can you zoom in a little in the browser? Sure, let me do that. Thank you. Yeah, it's all good now.

So I just created the device with the name jetson-nano-001, everything at the defaults, using a symmetric key; we are going to use that data later. I will open the device. As you can see, let me scroll down, we have two modules here: the edgeAgent and the edgeHub. After we deploy our AI model, it should appear here too: this list will display all the modules that we deploy on the IoT Edge box.
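Behind that symmetric-key option, the device proves its identity with a SAS token: an HMAC-SHA256 signature over the resource URI plus an expiry time, which the device SDKs generate for you. This is a sketch with Python's standard library, following the documented Azure IoT SAS scheme; the hub name, device ID and key below are made up for illustration:

```python
import base64, hashlib, hmac, time
from urllib.parse import quote

def generate_sas_token(resource_uri, b64_key, ttl_seconds=3600):
    """Build an Azure IoT-style SAS token from a base64 symmetric key."""
    expiry = int(time.time()) + ttl_seconds
    # the string to sign is the URL-encoded resource URI, a newline, the expiry
    to_sign = f"{quote(resource_uri, safe='')}\n{expiry}"
    sig = base64.b64encode(
        hmac.new(base64.b64decode(b64_key), to_sign.encode(), hashlib.sha256).digest()
    )
    return (f"SharedAccessSignature sr={quote(resource_uri, safe='')}"
            f"&sig={quote(sig, safe='')}&se={expiry}")

# hypothetical hub and device, with a throwaway key
token = generate_sas_token("myhub.azure-devices.net/devices/jetson-nano-001",
                           base64.b64encode(b"not-a-real-key").decode())
print(token[:30])  # SharedAccessSignature sr=myhub
```

In practice you never write this by hand on IoT Edge: the runtime derives tokens from the connection string we will configure later.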
So let's jump to Custom Vision. I open customvision.ai, which is part of Microsoft Cognitive Services; let me show it inside my resource group, where I created my Custom Vision workspace. This is the page. Here I have my project called "soda cans", and this project has all the images of my soda cans tagged as up or down. Let's double-check the images that are up. Let me open one image so you can see how easily you can tag it. I am in the untagged images now, and you can see that the Custom Vision interface is really good: it has already found the object I want to tag, so I just click it, set the tag to "up", and that's it.

Let's check now the images that are down. This process is manual, guys: you need to come into Custom Vision and label and tag the images by hand. Just to illustrate, here I have one image with a soda can down. After the images are tagged, it is really simple: we just click Train, we are going to use Quick Training, and this takes a few seconds.

While that runs, let me show you the three metrics we need to understand first: precision, recall and mAP. Precision tells you: if a tag is predicted by our model, how likely is it to be right? This is just a development model, so it sits at 76% precision. Recall tells you: out of the tags that should have been predicted, what percentage did your model actually find? Sixty-six percent, which is not too bad; it's just a small dataset, and that's the reason. And mAP is the mean average precision: this number tells you how the object detector performs overall, across all tags.

Let's see if the training is still running... let's wait a little bit. OK, it is taking longer than I expected, so I will just export my last iteration; nothing has changed that affects this part. All you need to do is go to the Performance tab and hit Export. If you recall the formats we mentioned before, the ones usable on the Jetson Nano: you can use an ONNX model, or you can use TensorFlow. In this case I will use ONNX, so I select ONNX and click Download. And here is my downloaded model; it is a really small model, as you can see. We are not going to use this model just yet.
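Those first two numbers come straight from the usual definitions, so here is a quick Python refresher. The counts below are invented purely to land near the rough figures from the demo:

```python
def precision(tp, fp):
    """Of everything the model predicted, how much was right?"""
    return tp / (tp + fp)

def recall(tp, fn):
    """Of everything it should have found, how much did it find?"""
    return tp / (tp + fn)

# made-up counts chosen to echo the demo's numbers
tp, fp, fn = 16, 5, 8
print(round(precision(tp, fp) * 100))  # 76, roughly the 76% precision shown
print(round(recall(tp, fn) * 100))     # 67, close to the ~66% recall shown
```

mAP then averages precision across recall levels and across all tags; computing it needs the full ranked set of predictions, so it is not shown here.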
So now let's jump back to our IoT device. The first step is to install IoT Edge on the device, so let me jump to my Jetson Nano server here. I will log in. OK, good. Installing IoT Edge is very straightforward: Microsoft provides step-by-step documentation that covers installing it on an Ubuntu 18.04 or a Windows machine, and it is really simple. The first step is to add the Microsoft key and software repository feed. I have the commands saved here, so I will not go into too much detail; it is just a couple of commands, and at the end of the presentation there are resources where you can see more about this, so you can check the links. So, the first command here registers the repository and the Microsoft key. OK, that's the first step. Now we need to install the IoT Edge security daemon on the device. The first thing we need to do is run sudo apt-get update; let's wait a couple of seconds... it's done. Now the installation itself is really simple: sudo apt-get install iotedge. Let's wait a little bit while it installs all the dependencies... OK, it's done.

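The provisioning step we are about to do boils down to editing one section of the daemon's config file, /etc/iotedge/config.yaml. The shape looks like this; the connection string values are placeholders, and you copy your real one from the portal:

```yaml
# /etc/iotedge/config.yaml, manual provisioning section
provisioning:
  source: "manual"
  device_connection_string: "HostName=<your-hub>.azure-devices.net;DeviceId=jetson-nano-001;SharedAccessKey=<device-key>"
```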
Now we need to configure the IoT Edge runtime to link the physical device with the device identity that exists in Azure IoT Hub. For this demo we are using manual provisioning, so what we need to do first is edit the config.yaml, which we can do with sudo nano... let's open the file. OK, and let's find the manual provisioning section here... it's here. So I come back to the portal, go back to my resource group, open the IoT Hub, find IoT Edge and open the jetson-nano device. I will use the primary connection string here; this is just a dev environment, so no problem displaying it to you guys. I copy it and paste it in. Let me save: I have just set the provisioning source to manual with the device connection string, and that's it. Now we just need to restart the daemon, which is really simple: sudo systemctl restart iotedge. Next we check that everything is fine; let's check the status of the daemon with another command: sudo systemctl status iotedge. Yeah, the edgeAgent module was created; that's very good. Now we need to check the connection. The good part here is that the IoT Edge runtime comes with some built-in troubleshooting operations to help you. What we need to do is just run sudo iotedge check, and this will run a set of configuration and connectivity checks. As you can see, there are some warnings and errors, because it checks whether the way you are running IoT Edge is the proper way to run it in, let's say, a production environment. Let me give you some examples: here it flags the DNS server setting and other production-readiness items. But the important part is the connectivity check: everything there is OK, so we can say that my device is now communicating with Azure, and that's what matters. OK, now let's check whether we already have any modules running on the device.
For this we run sudo iotedge list, which lists all my modules. Good, the edgeAgent is running. The thing is: where is the edgeHub? The edgeHub only appears here when you deploy your first module, because it is one of the modules responsible for handling deployed modules' communication; that is the reason only the edgeAgent is running here. So my configuration on the IoT Edge device is almost done. Now we just need to copy the custom model that we generated in the Custom Vision service, and, to not lose any time, I uploaded this model to OneDrive so that I can simply run some commands. With the first command I open my DeepStream folder; this is my DeepStream folder, and inside it I have my custom models directory. Let's download the model first... OK, my Custom Vision model is downloaded. Now we just need to move the model into the right folder.

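Everything the device runs is described by a deployment manifest that the edgeAgent pulls down from IoT Hub, so it is worth seeing its shape before we generate one. A stripped-down sketch in Python follows; the module name and image are placeholders, and a real manifest carries more settings (schema version, runtime, system modules) than shown here:

```python
import json

# minimal shape of an IoT Edge deployment manifest (illustrative, not complete)
manifest = {
    "modulesContent": {
        "$edgeAgent": {
            "properties.desired": {
                "modules": {
                    "deepstream": {                       # placeholder module name
                        "type": "docker",
                        "status": "running",
                        "restartPolicy": "always",
                        "settings": {"image": "myregistry.azurecr.io/deepstream:1.0"},
                    }
                }
            }
        },
        "$edgeHub": {
            "properties.desired": {
                # route every module message up to IoT Hub
                "routes": {"upstream": "FROM /messages/* INTO $upstream"}
            }
        },
    }
}
print(json.dumps(manifest)[:20])  # {"modulesContent": {
```

The $edgeAgent section declares what containers to run, and the $edgeHub section declares how messages flow; that split is exactly the edgeAgent/edgeHub division of labour described earlier.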
Nice, the model is in place. There is one more thing: for DeepStream to understand how to parse the bounding boxes produced by a Custom Vision model, we need to download an extra library, a custom Yolo parser, so let's download that library here too. Just to remember, guys: this is just a development environment, so all of these steps would ideally live inside a CI/CD pipeline, which would build everything into the container so you would not have to do all the manual steps we are doing here. I am just trying to give you an overview of what is happening in the background. OK, now we need to download the videos that we are going to use to simulate the cameras, so I come back to my DeepStream folder and download the videos with another command; that's why I am not typing much. OK, downloading... nice. Now we just need to extract these videos to the folder... there, I have three videos.

So we are almost there. Let's go now to Visual Studio Code. Here I have my deployment template for DeepStream from NVIDIA. When you get started with the documentation that is in the resources at the end of the presentation, this template is the base for deploying the NVIDIA DeepStream module to the IoT Edge device. There are a couple of parts in it: the way the module communicates, the entry points, the configuration. I will not go through everything today; these are things you need to set inside the deployment that are specific to running this module on IoT Edge. The thing we do need to change here is the bindings. Basically, what I am saying with these bindings is where my IoT Edge module should look for its configuration: here I have my custom configuration, the custom streams folder with the videos it needs to read, and the custom model, which is the actual ONNX model we downloaded, the one we exported from Custom Vision; you need to point to it here.

The next step is to actually deploy the IoT Edge module. Let me just check that everything is OK with my Jetson... it's here, that's good. First let me create a deployment manifest: after the changes we made, I just right-click here and choose "Generate IoT Edge deployment manifest". OK, it's generated. And how do we deploy it? Here in the config folder I have the deployment manifest, so I just right-click again and choose "Create deployment for single device", and it shows my device, the one we registered in the IoT Hub. I select it, and OK, the deployment is created. What is happening right now under the hood is that IoT Hub is telling the device to run this module, and the device is downloading the image and bringing the module up. Let's leave the monitor running here for the messages that my Jetson Nano sends to the cloud; this can take a few seconds. In the meantime, let's check the list of modules running: I already have my NVIDIA DeepStream module listed here as well. What we need to do now is check that the module is really working, so let me open VLC.

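The messages we are about to watch carry, for each detection, an object label and its bounding box. Here is a hedged Python sketch of consuming such a payload; this schema is an illustration of the idea, not the exact DeepStream message format:

```python
import json

# illustrative payload: the real DeepStream/IoT Edge message differs in detail
payload = json.loads("""
{"camera": "line-1",
 "detections": [
   {"label": "can_down", "confidence": 0.91, "bbox": [120, 80, 60, 140]},
   {"label": "can_up",   "confidence": 0.88, "bbox": [300, 75, 58, 142]}
 ]}
""")

# raise an alert only for confidently detected fallen cans
alerts = [d for d in payload["detections"]
          if d["label"] == "can_down" and d["confidence"] >= 0.8]
print(len(alerts), alerts[0]["bbox"])  # 1 [120, 80, 60, 140]
```

Filtering like this on the cloud side, in Stream Analytics or an Azure Function, is exactly the kind of line-of-business follow-up mentioned below.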
I will check that the job is running by connecting to the device with VLC and opening its network stream. Let's hit play. OK, it's already running; if you cannot see the images, just let me know. What is happening right now is that the Jetson Nano is looking at the videos, drawing these bounding boxes, and checking whether each can is up or down. The video files loop, that's why it keeps going back, reconnecting and drawing the bounding boxes labelled up or down.

Let me check the messages. OK, we are already sending messages to the cloud: as you can see here, my Jetson Nano is saying that it found an object, this is the location of the object in the image, and whether the detected can is up or down. So, back to VLC: what we are visualizing here are three real-time video streams with the Custom Vision AI model, the one we built in minutes, detecting a custom object. For this example we are using three simulated cameras, three video files, but of course you could use real-time cameras. And from Visual Studio Code you can see that the possibilities from here are many: using the information that we are sending through the IoT Hub, you can connect Stream Analytics, you can connect Logic Apps, and from there you can build your line-of-business application. It is really straightforward, and this is a running AI custom vision solution using the Jetson Nano and the DeepStream SDK.

OK, with that I think I conclude my demo. We can jump now to the questions; if you have any questions, just hit me.

We have a question here. Doreen asks: is it possible to have a container image with IoT Edge pre-installed? Is it possible to have a container image with IoT Edge pre-installed... no, you would build that into your device distribution pipeline. In Azure IoT you have the Device Provisioning Service, and using the Device Provisioning Service you can tell the device, at provisioning time, to deploy the IoT Edge containers onto it, and from there it starts to pull the containers down onto the device. So the process is based on that: you have the device, you build a pipeline so that the device connects automatically to the cloud, and from there the Device Provisioning Service is able to drive the deployment of the containers. Yes, that's clear.

Apparently we have no more questions here, but I actually have one question for you, just a short one: in which kind of scenarios can people tell whether they need an IoT Edge solution or something else? I think the simplest case is when you want to avoid sending too many messages. Let's say, as I mentioned, you are on a ship in the middle of the sea and you have a limited connection between your device and the cloud: you store all the messages inside the IoT Edge device, and when you find a connection, you send the telemetry up to the cloud. Another case is when you want to process at the edge before sending to the cloud, for example using an Azure Functions module to trigger only on a condition. Let's say you are in a factory monitoring a tractor, a car or some machine, and at some point you want to watch the temperature: only when it hits a limit do you send the message up to the cloud. That is the good part about IoT Edge: you have this logic inside the device and avoid sending telemetry and doing processing in the cloud all the time. OK, perfect. We have another question.

The question is: is there a vehicle in the video? I can see the vehicle message being sent in the logs. Exactly, I saw that as well. What we basically have here is that NVIDIA includes a couple of other models inside the DeepStream SDK. What is happening, and I only found this out last week and have reported it to the developers to take a look, is that it is looking at the image and finding something like a car using another model inside the SDK. It is not in my Custom Vision model, it is not anywhere in my dataset, so for sure it is a bug inside the DeepStream beta; that is the reason.

All right, can you just jump quickly back to the slides? Yeah, sure. Just so you know, in case you did not have a chance to do it at the beginning of the session, I would like to ask you to register for this session, using this opportunity provided by Microsoft to access relevant materials related to it. To do that, you can scan the QR code or use the AK.MS link; you just need to fill in a quick, short form and you will have access to these materials. It is also important for us, because it helps justify all the efforts that go into this initiative, so please just take a minute and register.

We have another question here, Victor, a little bit relevant to this topic: what is Victor's opinion about the Jetson Nano? I think it is one of the best devices so far if you want to implement any kind of machine learning or custom vision on an IoT device. It is my professional opinion that if you would like to start working with Custom Vision AI models and machine learning on IoT devices, it is the best you can do right now. It is really cheap: for 99 dollars you can have this device. I think I have the box here; you receive a box about this size. And the developer program from NVIDIA is amazing: you can use a lot of materials from there, and they have full support to connect with Microsoft and Microsoft Azure. So go for it. Perfect, I believe we are good to proceed to the next slide, Victor, so you can present your series of sessions.

Yes. I put some resources here for you. The first one is the NVIDIA Jetson Nano Developer Kit page, the main entry point for you to access; it's amazing. Then there is the NVIDIA developer program: the documentation is almost as good as Microsoft Docs, they have everything, they have a forum with people supporting you all the time, so if you are really interested in developing DeepStream, TensorFlow, AI and machine learning things on an IoT device with a GPU, go there. And this demo is available too: it is not my demo, it was produced by Microsoft in the Azure samples, so you can go there and take a look at how they deploy the same thing, but instead of a soda can factory they are detecting cars, using the same device, and instead of just three streams they run eight streams on the same Jetson Nano. So again, there is a showcase there, and you can get started with Azure IoT Hub and IoT Edge, really amazing products as well, plus the Device Provisioning Service, which is not listed here but is all part of the same ecosystem.

OK, so we are preparing an IoT series for August that will help you understand much more about Azure IoT, start to develop solutions, and get ready for the AZ-220, the Microsoft certification for IoT developers. First we will have an overview of Azure IoT services: in this session you will become familiar with the Azure IoT services, learn the elements needed to build the development environment, and see how to connect a device with Azure IoT Hub, taking the first steps of the journey to become an Azure IoT developer.

The second session is connecting IoT devices to Azure: in this session we will explore registering an IoT device in Azure IoT Hub, how to run a simulated device, and how to check the telemetry sent by this device. The third session is setting up the IoT Edge device: I will go into more detail about the IoT Edge device and how to connect it, and we will use an IoT device that monitors the temperature of one of the machines, deploy a Stream Analytics module to compute the average temperature, and send an alert to the device to stop the equipment. If you want more details on our agenda, this is a session you must join. And in the last session we will create an Azure IoT Central custom app and a device template, simulate a truck following a route selected with Azure Maps using Visual Studio Code, and also monitor and command the simulated device from IoT Central.

Victor, there is a typo, just for people to know: there is a typo on the first date, it is the 7th of August, and the easiest way to memorize the dates of these sessions is that they will all be on Fridays in August. Every Friday it will be an IoT session, basically an IoT Friday. Yeah, IoT Friday; I have just fixed it. So, starting on the 7th of August we have these four sessions, and we will do a kind of quiz at the end of each session so that you can practice and get comfortable with the questions in the exam; you will be really fine. With that, thank you very much, guys, for having me today and joining me today. Again, if you have any questions, just reach out to me on social media, on Twitter or LinkedIn.

OK, so just before we finish this session, I want to let you know that next week we will have another great session, this time with Duane delivering the second session of the Azure Solutions Architect certification series. That session will be related to infrastructure as a service on Azure, so if you are studying or planning to study for this certification, don't miss this opportunity to attend the session with Duane and learn more about how to be prepared to get this certification. The link to the session is in the chat, so register and don't miss it. Thank you, Victor, for delivering this great session, and thank you everyone for joining and dedicating your time to us on this session; I hope you enjoyed it. I have also dropped in a link to our session survey: we really appreciate your feedback, so we can improve our sessions and deliver future sessions on topics that are useful and of interest to you. Have a great day, thank you once again, and happy Monday to everyone. Thank you, see you next time. Thank you.