Returning documents as one file back from Azure CosmosDB – Part II

In my previous post, I showed how to get all documents of one collection into a single JSON file. The best reason to do it this way is that you only need a browser, independent of any hardware or tooling. But that approach has a drawback in some situations: if your documents' content is too large, the Data Explorer cannot concatenate everything together and falls back to paging. Not what we want!

There is another option for getting all documents from Cosmos DB into one JSON file: the “Data Migration Tool” for Azure Cosmos DB. It's pretty easy to install and use; just have your connection string ready. I will not duplicate content that already covers the details of how to use it and what to fill in to get the right output, so take this post as a pointer to the tool. I have used it very often for various tasks, which is why I recommend having a look at it. Visit here

If you have experience with the Data Migration Tool, let me know and leave a comment.

Returning documents as one file back from Azure CosmosDB

If you ever tried to get multiple documents from a Cosmos DB collection as one “file”, then follow the steps below. There can be different requirements for doing this, but those are not in scope for this blog.

Option I – directly in Azure Portal

(Option II – using the Data Migration Tool)

Here is what you can do:

    1. First, open up “Data Explorer” in your Cosmos DB resource (Azure Portal) and select the DB -> Collection of your interest.
    2. Click “New SQL Query” -> this opens a new query pane with a select statement, plus “Settings” to open the settings slide-in.

      [Image: new query pane]
    3. In that slide-in you have several options:
      – one is to set a custom value for how many results are displayed per page
      – another option is to remove the default limit of 100 and make it unlimited

      [Image: unlimited page results]
      [Image: custom page limit]

       

      ! Don’t forget to save !

    4. Next, go to the query pane and keep the default select statement if you have no other conditions on the outcome (see the example query after this list). Click “Execute Query”. You can now grab the output in the results pane with CTRL+A and CTRL+C.

      [Image: execute query]
    5. Open your favorite text editor (I like VS Code very much), paste in the clipboard and save it as a file.
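
For reference, the default statement generated by the query pane is the catch-all select:

    SELECT * FROM c

If you do have conditions, a filtered variant could look like the following; this is only a sketch, and the “type” property is just an example, not something every collection has:

    SELECT * FROM c WHERE c.type = "order"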

Here you go! 🙂 Hope this helps. If you have other suggestions, you’re welcome to add comments.

Getting from CSV to JSON is pretty easy

This is a short reminder for myself, but it could be helpful for someone else out there. Nothing new!

I often look for easy solutions that solve “easy” problems. This is one of them: “convert something to something”.

After looking around in the outer spheres of the Internet, I realized that out-of-the-box tools can do the job better, or at least without any dependencies and such.

So, how to convert a regular CSV to JSON? Take PowerShell!
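
A minimal sketch; the file names are placeholders and the delimiter depends on your CSV (drop -Delimiter for the default comma):

    # read the CSV, turn every row into a JSON object and write the result to a file
    Import-Csv -Path .\input.csv -Delimiter ';' | ConvertTo-Json | Out-File .\output.json -Encoding utf8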

It's easy, isn't it? 🙂

DevOps applied to IoT – A report from buildingIoT

A report on my visit to the buildingIoT conference.

From May 3 to May 5, 2017, the buildingIoT conference took place in Heidelberg. It was a format where “developers” and like-minded people meet to exchange ideas on all conceivable topics within the scope of IoT. Technologies, processes, experiences and the topics of digitalization were discussed. Quite a few good contacts were also made at the get-together events.

For me, the visit was a special experience, since it was the first time I gave a talk in front of an audience with such a wide range of experience and knowledge. My topic was “How DevOps can look in IoT development”. My message, however, was more like “Why DevOps is the only choice for IoT…”.

The first day of the conference started relaxed, with a whole set of workshops around IoT. From a beginner's guide to an MQTT deep dive, there was an interesting selection throughout the day.
In the evening, all speakers, sponsors and organizers were invited to dinner. That suited me well, as it let me get to know other speakers. – At this point I would like to send greetings to Steffen and Niko 😉 it was great meeting you; our conversations enriched me. (By the way, I can highly recommend the restaurant Tati – that is where we met.)

On Thursday it was my turn. After the keynote, my session started: 70 minutes of DevOps in IoT.

It was great… the room was full, nobody left the talk, and at the end so many interested people and questioners surrounded me that I didn't even make it to the lunch break (which honors me).
What pleased me the most, though, was that even my demo, consisting of coding, builds, deployments, a backend in Azure and even my IoT hardware with a WiFi connection, simply worked. So I can consider myself quite satisfied. 🙂

During the rest of the day I followed a wide variety of topics, for example how speech recognition can be implemented in devices, or digitalization stories with Opitz Consulting.
Towards the evening there was plenty of exchange about all kinds of topics and experiences among the participants, during the usual networking at the “get-together” with small snacks and a beer 🙂

Unfortunately, I had to attend an internal company training on Friday, which is why I had to leave the conference early. Nevertheless, I would like to come back next year.

At this point, once again, my warmest thanks to the organizing team and the sponsors. You did a great job.

Best regards
Thomas

Accessing a SQL Server instance with the command line

Working with a local SQL Server can sometimes be challenging if you don't have any tools to access a database. For administrative reasons it can be helpful to gain access to SQL Server; you can simply use the command line, cmd.exe, or PowerShell tools. This is nothing new, but I think it is not very common.

So, to start, open up cmd.exe and type, for example:
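
A minimal sketch, assuming the default instance name of SQL Server 2012 LocalDB; adjust the -S value to your own instance:

    sqlcmd -S (localdb)\v11.0 -E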

This command opens a trusted (-E) connection to your local instance (-S) of SQL Server 2012 LocalDB. Note: these switches are case sensitive.

Then sqlcmd prompts for further commands with “1>”. Here you can type T-SQL statements like:
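
For example, a simple statement that lists the attached databases (any T-SQL works here):

    SELECT name FROM sys.databases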

The prompt then waits for a batch terminator, for example GO + <Enter key>.

After this, your SQL Server instance runs the command and returns the list of databases that are attached.

If you would like to know more, read here: https://msdn.microsoft.com/en-us/library/ms162773.aspx

Beginner's Guide – Azure IoT Suite

For those interested in doing some really awesome things with things, I recommend having a closer look at the Azure IoT Suite.

It is a kind of website that enables you to get started with IoT in minutes. The Azure IoT Suite applies all the IoT capabilities of the Azure cloud. In the form of the web application that the IoT Suite offers, you can dive into the world of IoT.

But let’s see how to start…

This guide shows how to create and work with the Azure IoT Suite.

And here are the prerequisites:
– an Azure subscription (use your Microsoft account and register for a 90-day free subscription)
– maybe some devices, if available (it's not a must)

 

  1. First hook into https://www.azureiotsuite.com/ … register or sign in
  2. Next, you see the following screen:
    [Image]
  3. To proceed, click on the tile with the big plus on it.
  4. As the next step shows, you now have two options to proceed:
    [Image]
    Here you can either select a predictive maintenance solution or a remote monitoring solution. What is the difference? The “predictive maintenance” concept is based on evaluating data with machine learning to predict issues of monitored systems. The “remote monitoring” solution consists of dashboards and monitoring tools that also enable specific device management.
    I would recommend starting with “remote monitoring”, because it is easier for a start. Machine learning is made really simple with Azure ML, but as a topic it is still a complex one.
  5. So click on “Remote monitoring” and enter all the necessary details:
    [Image]
  6. After you click on “Create solution”, Azure IoT Suite starts the deployment process.
    What it really does in the background is simply gather the sources for the web apps and web jobs from GitHub (https://github.com/Azure/azure-iot-remote-monitoring) and start the deployment scripts from there.
    So, if you like, you can go directly to GitHub, grab the sources and start the PowerShell scripts / batch files yourself.
    Here is a hint: if you check the picture in step 5, you can see the provisioned components of the IoT Suite app.
    Look carefully at the SKUs (stock keeping units): IoT Hub is set to S2, one App Service to P1, another to S1, and storage to Standard-GRS.
    These SKUs aren't cheap, so after creating the “remote monitoring” solution you should go to the different services and lower the units.
    [Images]
  7. Now Azure is creating your solution
  8. Lastly, you have to accept some authentication and access requests. After confirming, you can launch the app:
    [Image]

Lowering prices

…and here is how!

1. First, take IoT Hub. Go to http://www.portal.azure.com, locate your resource group (in my case BlogTT2) and click on the IoT Hub (“BlogTT2xxx”)
[Image]

[Image] Don't forget to save!

2. Next, go to the App Services and switch to a lower SKU, as the following example shows

[Image]

3. Also check storage. This is a big cost driver, so reduce it to an LRS SKU as in the following picture

[Image]

 

4. With these tweaks you can reduce the cost from over $100/month to roughly $50.

 

…Hope you got everything right. Play around and get comfortable with IoT 🙂

How to change hosts entries on network changes

Why?

I change some special hosts file entries depending on the network I am connecting to.
One is Direct Access (DA) into my company's network, which works with IPv4 address resolution. The other one connects directly to the office network, using IPv6.
So I have to change my entries by hand to reflect the address resolution every time I switch between these two networks… and that is really annoying.

After a couple of years (right…! doing this day by day, with the goal of finding a way out the next day, every day, the years passed) and many hard-to-track issues that could easily have been solved by not forgetting to change these entries in the hosts file, I tried to find a solution…

…and here it is!

It is so simple that I shouldn't post it here, to save face 🙂. But I think I am not alone with this.

The Windows Task Scheduler is the key. So, let me explain:

Think of a simple hosts file entry:
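
The original entries are not reproduced here, so the following is only an illustrative sketch: the host name and addresses are made up, and the “home”/“wlan” tags are simply the markers that the script further below searches for.

    # fdc8:aaaa::10   intranet.mycompany.local   # home (office network, IPv6)
    10.10.1.25        intranet.mycompany.local   # wlan (Direct Access, IPv4)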

 

(Currently I am connected via DA over IPv4, therefore the second line is uncommented and the first one is commented out.)

To automatically flip these settings, I can create a task (and of course there are other possible solutions) that runs on a detected NetworkProfile change. A little PowerShell script then modifies the right entries in the hosts file. So have a look at the following instructions:

  1. Open the Windows Task Scheduler
  2. Create a new task by right-clicking somewhere in the Task Scheduler's tree
  3. In the next dialog, enter a name for the task
    1. (it's up to you to decide whether it runs only when the user is logged on or not)
    2. Making changes to the hosts file is only possible with administrative privileges, so check “Run with highest privileges”
    3. Also set the configuration according to the machine it runs on
  4. Switch to the “Triggers” tab
    1. At “Begin the task:” select “On an event”
    2. At “Log:” search for Microsoft-Windows-NetworkProfile/Operational
    3. At “Source:” select NetworkProfile
    4. As “Event-ID:” enter 1000 (meaning “Network changed”)
  5. OK… slowly coming to an end…
    1. Switch one tab further to “Actions”
    2. Leave “Start a program” as the “Action:”
    3. Now we want a PowerShell script to be triggered on the network-changed event
    4. So… add PowerShell.exe as the “Program/Script” name
    5. As “Add arguments (optional):” add the script's path
  6. If you are not willing to spend the time on selecting the right event but want to achieve the same result, here is a simpler solution
    1. Switch to the “Conditions” tab
    2. Select the last checkbox and choose the network that triggers the task
  7. Save everything and go to the next step…

The triggered PowerShell script

Now everything is ready to run something on the network-change event. So here comes the script (I do not have to mention that there are more elegant ways):
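
The original script is not included here, so this is only a minimal sketch of the approach described below. The profile name “MyCompanyWifi” and the handling of the “home”/“wlan” markers are assumptions; adapt them to your own hosts file and networks.

    # first, retrieve the name of the currently active network profile
    $profileName = (Get-NetConnectionProfile | Select-Object -First 1).Name

    $hostsPath = "$env:SystemRoot\System32\drivers\etc\hosts"
    $lines = Get-Content -Path $hostsPath

    # "MyCompanyWifi" is an assumed profile name - use the name shown in the Task Scheduler combobox
    if ($profileName -eq "MyCompanyWifi") {
        # uncomment the "wlan" entries, comment out the "home" entries
        $lines = $lines | ForEach-Object {
            if ($_ -match 'wlan') { $_ -replace '^\s*#\s*', '' }
            elseif ($_ -match 'home' -and $_ -notmatch '^\s*#') { "# $_" }
            else { $_ }
        }
    }
    else {
        # the other way around: uncomment "home", comment out "wlan"
        $lines = $lines | ForEach-Object {
            if ($_ -match 'home') { $_ -replace '^\s*#\s*', '' }
            elseif ($_ -match 'wlan' -and $_ -notmatch '^\s*#') { "# $_" }
            else { $_ }
        }
    }

    Set-Content -Path $hostsPath -Value $lines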

(again, see my example hosts file above)

First, the script retrieves the network profile name (the name that is also listed in the combobox in picture 6.b). Depending on the name of the current network profile, all lines in the hosts file containing the string “wlan” are uncommented, while the ones containing “home” are commented out.

The result after a network switch is:
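
Continuing the made-up example from above, the entries are simply flipped after the switch:

    fdc8:aaaa::10     intranet.mycompany.local   # home (office network, IPv6)
    # 10.10.1.25      intranet.mycompany.local   # wlan (Direct Access, IPv4)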

Making ActiveMQ's Topics virtual for use with Queues

The problem

Before I start explaining the core part, I will tell you the reason why I am posting this…

…or you can jump directly to the solution.

Everything started with RabbitMQ 😉 in a major enterprise IoT project for a well-known German refrigerator producer that was mentioned by Satya Nadella at the Hannover Messe in Germany in 2016.
The goal was to bring smart devices online with a common messaging protocol. But because of some issues with the broker (handling certain kinds of certificates), the decision was made to replace RabbitMQ with ActiveMQ.
The technical problem with that decision was that all sinks of messages sent to the broker via MQTT are handled by ActiveMQ as topics. This makes the whole communication architecture non-scalable.

Please let me explain some details regarding Scalability and Topics vs. Queues…

If you want an IoT architecture that is scalable, you should enable your environment to handle as many messages as possible in parallel. To achieve this, you have to decouple logic/functions into worker units. With this, you can increase the number of parallel workers, which can then consume messages in parallel. As you can see: this is scaling! But what is very important to point out is that these workers can only consume messages in parallel if the source of the messages (where the workers consume from) delivers them in a FIFO pattern (one in, one out). This is also known as the queuing concept.

Queues

Queues are constructs that allow a consumer to handle only one message at a time. With that, you have a concurrency-enabled environment where all consumers are “battling” for messages. So with this pattern you can keep queues “empty”. And if there is more load on the system, you can scale by waking up some more instances of the same worker.

Topics

The other, let's call it “message handling strategy”, is topics. The idea behind this is to have multiple different workers consuming topic-related messages. Think of the following scenario: you have a smart device sending notifications, alarms and other urgent messages, but it also sends some logging/monitoring messages. If you want some backend modules that handle these different types of messages in different ways, then you need topics. A message would then be routed to either a topic called “Alerts” or “Loggings” or whatever… With that strategy you deliver one message to all the consuming workers listening on the same topic.

The problem with the change of broker

Maybe you can now see the main problem with the broker change: after switching from RabbitMQ, which handled all messages sent via the MQTT protocol in queues, the concept changed to topics, because ActiveMQ handles MQTT messages with a topic strategy.

[Quote: “ActiveMQ is a JMS broker in its core, so there needs to be some mapping between MQTT subscriptions and JMS semantics. Subscriptions with QoS=0 (At Most Once) are directly mapped to plain JMS non-persistent topics. For reliable messaging, QoS=1 and QoS=2, by default subscriptions are transformed to JMS durable topic subscribers. This behaviour is desired in most scenarios. For some use cases, it is useful to map these subscriptions to virtual topics. Virtual topics provide a better scalability and are generally better solution if you want to use you MQTT subscribers over network of brokers.“, https://activemq.apache.org/mqtt.html ]

And that makes our architecture inefficient and not scalable. One possibility to get out of this situation is to change the smart device to send via AMQP (for example), but this would mean changing the whole project plan and, with it, the time to market. Or, as a second solution, you do some “magic” configuration on the broker's side 🙂

The solution

How to change Topics to virtual topics

As a developer in the Microsoft ecosystem (Visual Studio, C#, .NET, …) it is not trivial to get into this broker and understand the different concepts, “languages” and other “strange things” that come with ActiveMQ. The documentation of the broker is also not that detailed or easy to understand if you only want to use the product (which is understandable, because hardly anyone uses it as an out-of-the-box tool). But I think the solution here could be useful for other devs like me who are only looking for a way around the problem I explained before.

First: Config

Somewhere in the install folder of ActiveMQ…

There is a “conf” folder containing the activemq.xml. There you can find your allowed and configured connections in the transportConnectors section.

In the excerpt below you can see the line where the MQTT protocol is enabled. This line gets extended with the ActiveMQ parameter for the subscription strategy (see below). https://activemq.apache.org/mqtt.html
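
The original excerpt is not reproduced here, so the following is only a sketch of what such a transportConnectors entry can look like; the port and any additional URI options may differ in your activemq.xml. The subscriptionStrategy parameter is the one described on the MQTT page linked above:

    <transportConnectors>
        <!-- MQTT connector, extended with the virtual-topic subscription strategy -->
        <transportConnector name="mqtt"
            uri="mqtt://0.0.0.0:1883?subscriptionStrategy=mqtt-virtual-topic-subscriptions"/>
    </transportConnectors>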

If you haven't already, you should also fix your code to read from queues:

  • Create consumers for queues instead of topics and pass the new path pattern…
  • With virtual topics, the name of the queue changes to a pattern like this: Consumer.Application.VirtualTopic.QueueName.
    An example: if you send a message to the topic Alert using the pattern smartDevice.number.Alert, it changes to (from the broker's or consumer's perspective) VirtualTopic.smartDevice.number.Alert.
    That is then consumable via a path like Consumer.AlertWorker.VirtualTopic.smartDevice.number.Alert (meaning: I have a worker, or multiple instances of it, handling alert messages from the queue “Alert” of a specific device).
    You can also use wildcard characters like *, e.g. Consumer.AlertWorker.VirtualTopic.smartDevice.*.Alert (meaning: I have a worker, or multiple instances of it, handling alert messages from the queue “Alert” of all devices).
    … and so on.

I hope someone can make use of this solution; otherwise it may lead to another one. Please share your thoughts!

Update for the OneDrive client (Windows 10)

The OneDrive client recently received an update. You can now finally synchronize an additional business account locally. The advantage: you get everything in one place and can control it via one app.

…and here's how…

  • Open the OneDrive client's menu with a right-click on the little cloud in the taskbar, then click “Settings”
    • Then go to the “Account” tab and click “Add a business account” at the bottom (as shown in the picture)

    • Then sign in with your business account (and adjust the local OneDrive folder if necessary)
    • Afterwards you can select the folders you want to synchronize.

    • Once you have reached the following screen, everything is set up. (Please don't forget: depending on the amount of data, the synchronization can take a while and may “disturb” your system a bit)

Chocolatey – Error: Collection is read-only

If you are a Windows Insider and received the latest build 14331/14332, you may have some trouble getting the choco command to run in PowerShell. If you try to install a package – it doesn't matter which one – you will get the following output:

[Image: example of installing netscan]

What happened? A bug in PowerShell. So, if you like to avoid this, you have some options 🙂

1) Go back to a previous stable build

2) Wait until everything gets fixed

3) Do the following:

Open the file "C:\ProgramData\Chocolatey\helpers\chocolateyScriptRunner.ps1". As a quick fix, replace

    [alias("params")][alias("parameters")][alias("pkgParams")]

in the header of that file with

    [alias("params","parameters","pkgParams")]

After that, try your install once again. It should now succeed.

Please see this post, where I got this fix from, for details: https://github.com/chocolatey/choco/issues/659