<h1 id="face-detection">Videostreams as input source for the Internet of Things</h1>
<p><em>2019-06-30</em></p>
<p><strong>(republished on <a href="https://www.linkedin.com/pulse/videostreams-input-source-internet-things-dominik-bial/?lipi=urn%3Ali%3Apage%3Ad_flagship3_profile_view_base_post_details%3BSiXxsp0gReuMXxz%2B1Evg0A%3D%3D">linkedin</a>)</strong></p>
<p>Nowadays the Internet of Things (IoT) is a huge cosmos with many different sensors and actuators. These promise deep insights into processes and possibilities for control. The combination of different sensors allows the identification of specific situations or anomalies. However, such combinations might be extremely complex. This often means that costs or constraints might hinder the introduction of IoT and dedicated sensors. Additionally, these sensors might not be attached to existing solutions which means that they are not suitable for retrofitting scenarios. Then it might be time to think about cameras as an alternative IoT sensor.</p>
<p>Cameras are cheap but powerful. A single image already transports a large amount of information, and moving pictures are even more powerful. There is a lot of information that can be communicated, for example movement, state changes, objects and even emotions.</p>
<h2 id="cameras-in-tesla">Cameras in Tesla</h2>
<p>How much information can be transferred via a camera is demonstrated by the autopilot functionality from Tesla. Most manufacturers use a combination of depth and distance sensors. In comparison, Tesla bases its autopilot mainly on cameras, video streams and software that analyses the stream in real time.</p>
<p>The following YouTube video gives an impression of what the camera-based input looks like.</p>
<iframe width="560" height="420" src="https://www.youtube.com/embed/V4PDTD2VHSU?color=white&theme=light"></iframe>
<p>Tesla’s general idea is that cameras provide enough information to steer a car autonomously if a 360° view is given. As a result, the software becomes the key component in their solution as it has to understand the video stream. It is the most important part which needs to be adapted and made intelligent enough to react to different conditions. From my point of view this is a very interesting concept as this might result in greater flexibility. The software can be simply updated and no additional sensors need to be installed in the car to improve the autopilot performance. However, the upcoming years will show if this is the right approach.</p>
<h2 id="ai-drives-image-processing">AI drives image processing</h2>
<p>AI makes it possible. During the last couple of years the area of computer vision has made huge progress, driven by machine learning and the underlying processing algorithms. There are a lot of pre-trained machine learning models on the market for cloud and on-premise settings that help to extract information from video streams to realise different use cases. This means that you do not have to train your own models.</p>
<p>Here are a few examples of IoT use cases which can benefit from camera input streams:</p>
<ul>
<li>Detecting the number of people in a room</li>
<li>Counting the visitors or passers-by of a shop</li>
<li>Detecting state changes of a machine by tracking control lights or movements</li>
<li>Path recognition and tracking of goods</li>
<li>Counting objects, for example produced goods</li>
</ul>
<h2 id="prototyping-and-demonstration">Prototyping and demonstration</h2>
<p>We recently had to run a workshop, which is why we consolidated our knowledge about AI and built a few things for demonstration.</p>
<p><img src="/assets/blog/benniInAction_delayed.gif" alt="facedetection" title="face detection prototype" /></p>
<p>We used the Python programming language in combination with OpenCV, an open source library for computer vision. OpenCV already ships with pre-trained machine learning models; additional ones can be found on the Internet or obtained directly from OpenCV.</p>
<p>We used the Haar cascade algorithm to identify faces in a camera input stream in order to determine if there are people in a room. By positioning the camera at the entrance to our offices we can now tell if the office is empty or if colleagues are already in.</p>
<p>By placing a second device in front of the coffee machine we can actually tell how often the coffee machine is visited. Although these are simple examples, they show the possibilities if you apply such a device, for example, in production.</p>
<p>To demonstrate and check the implementation, we added green rectangles around the recognised faces and recorded the manipulated video. An extract of such a result is shown above.</p>
<p>All in all, it took us two hours to get everything set up and running, and we were able to gather information from our environment. From my point of view, that is impressively efficient and only possible thanks to the great and open-minded developer community.</p>
<p>The example shows the powerful combination of AI and video streams to detect specific situations in the wild, and that it might be worth investing some time in this technology to evaluate opportunities. It might help to broaden knowledge about processes and machines and, therefore, generate new insights.</p>
<h1 id="industrial-server">Computing on the edge – our S1 industrial server</h1>
<p><em>2019-06-30</em></p>
<p><strong>(republished on <a href="https://www.linkedin.com/pulse/computing-edge-our-s1-industrial-server-dominik-bial">linkedin</a>)</strong></p>
<p>Over the past years most companies have developed a cloud-first strategy and migrated IT systems from on-premise settings to the cloud. Still, many of these projects are ongoing or newly started. There are various reasons to move to the cloud - a few examples are:</p>
<ul>
<li>A company does not need to invest in hardware to run its own applications. Capital is not tied up.</li>
<li>Separation of concerns: A company needs to have their applications up and running but is not really interested in the underlying hardware. The applications provide business value and probably differentiate the company from competitors while the hardware is a resource.</li>
<li>Making use of the expert knowledge of cloud providers instead of reinventing the wheel.</li>
<li>Time to market is faster.</li>
<li>Flexibility as resources can be simply booked when they are required.</li>
</ul>
<p>Generally, the cloud is a highly optimized and integrated combination of computers and services. However, in some settings the cloud has disadvantages. External applications communicate with the cloud via Internet technology. This means that system landscapes and applications are subject to the limitations of the communication layer. These are:</p>
<ul>
<li>Latency</li>
<li>Bandwidth</li>
<li>Connectivity</li>
</ul>
<p>In highly distributed environments, where a lot of communication takes place and data needs to be exchanged fast, executing logic and analytics in the cloud might not be the best solution.</p>
<h2 id="edge-fog-and-mist">Edge, fog and mist</h2>
<p>The scenario described above can be found quite often in IoT and Industry 4.0 settings, especially when you talk about production and machines. Latency can be crucial for stopping machines immediately if a technical fault is detected. A complete connection loss would result in a production downtime, which is extremely expensive.
This is why there is a tendency to handle data and to execute logic as close as possible to manufacturing.
Edge, fog and mist are terms which describe this tendency to have computation power in manufacturing. They have more or less the same meaning: an additional layer between machines, sensors, devices and the cloud. This layer contains software to run, for example, stream analytics, data collection, communication and orchestration.</p>
<h2 id="industrial-server-at-s1">Industrial server at S1</h2>
<p>For prototyping purposes, we recently bought an industrial server which is protected against dust and vibration. Therefore, we are now able to work on industrial scenarios which require more processing power or need to establish the edge concept.</p>
<p><img src="/assets/blog/industrial-server.png" alt="industrial-server" title="industrial server at S1" /></p>
<p>The industrial server has 16 GB RAM and an Intel Core i5-7200U, which is sufficient power to run different servers or IoT edge software. Additionally, it offers many options to connect to different kinds of networks.</p>
<h2 id="quickly-tested">Quickly tested</h2>
<p>As an initial test we installed CentOS (Linux) as operating system and the log management software Graylog.</p>
<p><img src="/assets/blog/graylog.png" alt="graylog" title="Screenshot of Graylog" /></p>
<p>Graylog is a great tool to search for log entries from different and distributed applications. Moreover, it has great visualization and alert functionality. Currently, a raspberry pi (a single-board computer, an IoT device) sends log messages with the current humidity and temperature to our industrial server. Graylog aggregates, stores and visualizes the data so that we have an overview of historic temperatures. Furthermore, we are able to define notifications and alerts if specific thresholds are reached.</p>
<h1 id="log-management">The power and necessity of log management solutions</h1>
<p><em>2019-06-30</em></p>
<p><strong>(republished on <a href="https://www.linkedin.com/post/edit/power-necessity-log-management-solutions-dominik-bial">linkedin</a>)</strong></p>
<p>“Logs” is a general term in software development which describes the recording of status updates or other events of an application, for example from a server or operating system. Logs provide information at a central location for developers and administrators to understand the software’s behavior, and they are used to solve issues. A common way to handle logs is to append entries to the end of a file so that they can be viewed with text editing tools.</p>
<h2 id="about-the-necessity-of-log-management-in-distributed-environments">About the necessity of log management in distributed environments</h2>
<p>Application landscapes are getting more complex every day. This is mainly due to the increasing distribution of software. Just imagine a mobile app that requests data from a server; even this simplest setup already results in two different log files. Therefore, a developer needs to check at least two different locations for maintenance. If you imagine IoT and Industry 4.0 scenarios where many different systems and parties are involved, the usefulness of log management systems becomes clear. It helps to centralize and relate log entries from different applications.</p>
<p>To simplify daily work, the concept of log management was developed. It makes use of a central system where all recorded logs are stored. With the help of a query language, log transformation features, dashboards and alerts, developers are able to handle logs more efficiently. For example: if you experience some strange network behavior at 08:43, you are able to define a search query that displays recorded logs between 08:40 and 08:50. Additionally, you can filter the logs so that only network issues are displayed.<br />
With the right logging strategy applied to your applications, operations can be simplified drastically. Relations among different applications can be made transparent, the impact of a fault on the application landscape can be made visible, and processes or workflows become traceable.</p>
<p>Traceability is one of my favorite features. If you have a specific identifier, for example an order number from your online shop, you are able to track and trace the order process. Just log the order number as part of each log entry. Changes made in the shopping system can then be related to changes in the billing system. You just need to search for the order number and all log entries related to it are displayed.</p>
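<p>As a sketch, this pattern can be implemented with Python's standard logging facilities; the logger name, log format and order number below are purely illustrative:</p>

```python
import io
import logging

# Every entry carries the order number so one search finds all of them.
LOG_FORMAT = "%(levelname)s order=%(order_id)s %(message)s"

def order_logger(order_id, stream):
    """Return a logger whose entries all carry the given order number."""
    logger = logging.getLogger(f"shop.{order_id}")
    logger.setLevel(logging.INFO)
    logger.propagate = False
    logger.handlers.clear()  # keep the example idempotent on re-runs
    handler = logging.StreamHandler(stream)
    handler.setFormatter(logging.Formatter(LOG_FORMAT))
    logger.addHandler(handler)
    # The adapter injects the order number into every record.
    return logging.LoggerAdapter(logger, {"order_id": order_id})

def handle_order(order_id):
    """Simulate two systems logging steps of the same order."""
    stream = io.StringIO()
    log = order_logger(order_id, stream)
    log.info("order received")   # shopping system
    log.info("invoice created")  # billing system
    return stream.getvalue()
```

<p>Calling <code>handle_order("ORD-4711")</code> returns two formatted entries, each containing <code>order=ORD-4711</code>, so a single search for that identifier surfaces the whole process.</p>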
<h2 id="added-value-for-non-technicians">Added value for non-technicians</h2>
<p>Typically, logs do not only contain technical data. Developers also record status changes such as a successful order. Therefore, log management and its additional features like dashboards might be interesting for business departments, too. This means that log management provides mechanisms to make business processes more transparent and to provide an initial solution for real-time dashboards. It might be a great tool to gather first insights and learnings.
In IoT and Industry 4.0 scenarios you have many different sensors, actuators and systems which generate data. Log management software can be a great tool to gather these data and to provide a frontend for users to browse and analyze the data easily before specific frontends are implemented. Therefore, log management helps to understand data and situations within your processes before your applications are changed or new software is developed.</p>
<p><img src="/assets/blog/log_mgmt_concept.png" alt="concept" title="log management concept" /></p>
<p>Let us have a look at a simple demonstration we developed at Schacht One.</p>
<h2 id="example-our-graylog-at-schacht-one">Example: our Graylog at Schacht One</h2>
<p><img src="/assets/blog/log_mgmt_pi.png" alt="raspberry" title="raspberry pi" /></p>
<p>We used a raspberry pi (a single-board computer) with an attached temperature sensor to record temperature and humidity every 30s. Both values are logged and sent to our industrial server, which runs Graylog as log management software.</p>
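<p>For illustration, a minimal sketch of what the raspberry pi side could look like, sending a reading to a Graylog GELF UDP input (the server address, port and custom field names are assumptions, and the actual sensor readout is omitted):</p>

```python
import json
import socket
import time

GRAYLOG_HOST = "192.168.0.10"  # hypothetical address of our industrial server
GELF_UDP_PORT = 12201          # Graylog's default GELF UDP input port

def build_gelf_message(temperature, humidity, host="raspberrypi"):
    """Build a GELF 1.1 payload; custom fields are prefixed with '_'."""
    return json.dumps({
        "version": "1.1",
        "host": host,
        "short_message": f"temperature {temperature:.1f} C, humidity {humidity:.1f} %",
        "timestamp": time.time(),
        "level": 6,  # syslog severity "informational"
        "_temperature": temperature,
        "_humidity": humidity,
    }).encode("utf-8")

def send_reading(temperature, humidity):
    """Fire-and-forget one reading to the Graylog server via UDP."""
    payload = build_gelf_message(temperature, humidity)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (GRAYLOG_HOST, GELF_UDP_PORT))
```

<p>Because the values are sent as structured custom fields, Graylog can aggregate and chart them directly rather than parsing them out of the message text.</p>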
<p><img src="/assets/blog/industrial-server.png" alt="industrial-server" title="industrial server at S1" /></p>
<p>Graylog, one of the leading log management tools, stores and prepares the data in such a way that users can search for log entries from the last three months. Of course, this interval can be changed. In the image below, a query was executed that searches for the term “temperature” in log messages which are not older than 5 minutes.</p>
<p><img src="/assets/blog/log_mgmt_query.png" alt="search-query" title="Graylog search functionality" /></p>
<p>Based on the temperature and humidity data from the raspberry pi, we configured two widgets for Graylog’s dashboard feature which visualize the data of the last 24 hours in two diagrams. The image shows the humidity and temperature values around 26 June 2019.</p>
<p><img src="/assets/blog/log_mgmt_dashboard.png" alt="dashboard" title="Graylog based temperature and humidity dashboard" /></p>
<h2 id="alerts">Alerts</h2>
<p>As we have absolute values of the temperature and humidity, we are able to define simple alerts. We just defined a threshold for the temperature so that an alert is fired when the temperature is higher than 30°C.</p>
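<p>Conceptually, the alert condition boils down to a simple threshold check, which could be sketched like this (the 30°C threshold mirrors our configuration; the function names are illustrative):</p>

```python
TEMPERATURE_THRESHOLD_C = 30.0  # mirrors the threshold configured in Graylog

def readings_above_threshold(readings, threshold=TEMPERATURE_THRESHOLD_C):
    """Return the temperature readings that would trigger an alert."""
    return [r for r in readings if r > threshold]

def should_alert(readings, threshold=TEMPERATURE_THRESHOLD_C):
    """True if at least one reading exceeds the threshold."""
    return bool(readings_above_threshold(readings, threshold))
```

<p>Graylog evaluates this kind of condition continuously over the incoming stream, so nobody has to poll the dashboard to notice an overheated room.</p>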
<p><img src="/assets/blog/log_mgmt_alarm.png" alt="alerts" title="Graylog alert feature" /></p>
<h2 id="a-short-summary">A short summary</h2>
<p>Besides the traditional use of logs to monitor applications’ states and behavior, logs also provide business value. Log management can help to make it available to non-technicians. Moreover, with the help of dashboards and alerts, data can be visualized and people can be made aware of interesting situations.
Additionally, log management is a great tool to get deeper insight into your systems and processes. It helps you to understand your data before application development is started.</p>
<h2 id="appendix-a-selection-of-log-management-software">Appendix: A selection of log management software</h2>
<p>There is a huge variety of log management software on the market. My favorite ones are:</p>
<p><strong><a href="https://www.graylog.org/">Graylog</a></strong>
One of the leading log management tools on the market. It is easy to install and simple to use.</p>
<p><strong><a href="https://www.elastic.co/elk-stack">ELK</a></strong>
ELK stands for Elasticsearch (basically a search engine), Logstash (a data processing pipeline for logs) and Kibana (a dashboard technology). These are great tools which can be combined into a log management environment. Using this technology stack, you are able to adapt the software to your specific needs. However, compared to Graylog it is more difficult to install and maintain.</p>
<p><strong><a href="https://www.splunk.com">Splunk</a></strong>
Splunk is a great tool and can be run in the cloud and on premise. Besides classical log management, further features for machine learning and predictive maintenance are provided. It is an extremely interesting piece of software for IoT scenarios. However, it is not free, although a free version for testing, called Splunk Free, is available.</p>
<p><strong><a href="https://www.signalfx.com">SignalFx</a></strong>
Last but not least, there is SignalFx which has great dashboards and is especially designed for integrating with the big cloud providers. They use stream analytics as core feature to create insights.</p>Dominik Bial(republished on linkedin)Visiting the DigitalXChange 2019 – My key takeaways2019-05-25T00:00:00+02:002019-05-25T00:00:00+02:00/2019/05/digitalxchange<p><strong>(republished on <a href="https://www.linkedin.com/pulse/visiting-digitalxchange-2019-my-key-takeaways-dominik-bial/">linkedin</a>)</strong></p>
<p>This was an extremely exciting but also an exhausting week – full of new ideas, new people I was allowed to meet and great discussions.</p>
<p>The final highlight of the week was my participation in this year’s <a href="https://digital-xchange.de/">DigitalXChange</a>, which is a conference for developers, UX experts and digital leads. More than 1000 people visited the conference. Talks were divided into 16 tracks, so it was quite hard to decide which one to attend. Topics ranged from design thinking, the agile/lean movement and user experience to software architecture, DevOps and software development.</p>
<h2 id="my-three-key-takeaways">My three key takeaways</h2>
<h3 id="schnellboote-im-trockendock-ein-blick-unter-die-haube-des-opensource-projekts-opendevstack-by-torsten-jaeschke">“Schnellboote im Trockendock: Ein Blick unter die Haube des Opensource Projekts OpenDevStack“ by Torsten Jaeschke</h3>
<p>The <a href="http://www.opendevstack.org">opendevstack</a> is mainly a provisioning tool that sets up a complete development environment within minutes. This environment contains Jira, Confluence and Git. Furthermore, templates are included to set up, for example, Angular or Spring Boot projects so that you can start with the development of an MVP really fast. Opendevstack is based on OpenShift. The idea behind the project is to separate innovation projects from compliance but to provide the ability to migrate projects back into the organisation easily. I really liked the combination of technologies and will definitely keep this open source project in mind for future prototypes.</p>
<h3 id="evolution-of-api-driven-architectures-by-sven-bernhardt">“Evolution of API driven Architectures” by Sven Bernhardt</h3>
<p>A really great talk which introduced the evolution of API gateways and API management platforms as well as different software development processes and architecture approaches. I especially liked the lightweight approach with Kong which, again, is interesting for prototyping. Kong can be used easily as the whole gateway is encapsulated within a Docker container. It can be run everywhere and can be easily migrated.</p>
<h3 id="smart-asset-management-and-predictive-maintenance-with-iiot-by-robert-van-molken">“Smart asset management and predictive maintenance with IIoT“ by Robert van Molken</h3>
<p>This was an inspiring talk! Robert introduced an Azure-based IoT architecture within an Industry 4.0 scenario. The talk was really helpful as it validated approaches we took within our own projects. Using Azure IoT Edge modules for local integration and forwarding data for processing within the cloud leads to a flexible and maintainable architecture concept – thanks for that.</p>
<p><strong>I really enjoyed the conference - looking forward to next year’s DigitalXChange.</strong></p>
<div class="jekyll-twitter-plugin"><blockquote class="twitter-tweet" data-width="220"><p lang="en" dir="ltr">Great Talk by <a href="https://twitter.com/lugus1980?ref_src=twsrc%5Etfw">@lugus1980</a> about the Open Dev Stack. Something I have to keep in mind for prototyping. <a href="https://twitter.com/lugus1980?ref_src=twsrc%5Etfw">@lugus1980</a> thanks a lot for the insights. <a href="https://twitter.com/hashtag/digitalxchange2019?src=hash&ref_src=twsrc%5Etfw">#digitalxchange2019</a> <a href="https://t.co/HOIkOpm2r5">pic.twitter.com/HOIkOpm2r5</a></p>— Dominik Bial (@BialDominik) <a href="https://twitter.com/BialDominik/status/1132193615441289216?ref_src=twsrc%5Etfw">May 25, 2019</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
</div>
<div class="jekyll-twitter-plugin"><blockquote class="twitter-tweet" data-width="220"><p lang="en" dir="ltr">Always a pleasure to listen to your talks <a href="https://twitter.com/sbernhardt?ref_src=twsrc%5Etfw">@sbernhardt</a> - Evolution of API driven Architectures <a href="https://t.co/euUxVgM7Fn">pic.twitter.com/euUxVgM7Fn</a></p>— Dominik Bial (@BialDominik) <a href="https://twitter.com/BialDominik/status/1132265219416305664?ref_src=twsrc%5Etfw">May 25, 2019</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
</div>
<div class="jekyll-twitter-plugin"><blockquote class="twitter-tweet" data-width="220"><p lang="en" dir="ltr">Smart asset management and predictive maintenance with IIoT by an old colleague from AMIS. Nice to meet you again <a href="https://twitter.com/robertvanmolken?ref_src=twsrc%5Etfw">@robertvanmolken</a> <a href="https://t.co/W4xL672B0A">pic.twitter.com/W4xL672B0A</a></p>— Dominik Bial (@BialDominik) <a href="https://twitter.com/BialDominik/status/1132219429155221507?ref_src=twsrc%5Etfw">May 25, 2019</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
</div>
<h1 id="bd-iot">Prototyping as a first step to digitization!</h1>
<p><em>2019-04-01</em></p>
<p>I published another article on linkedin for SchachtOne, which is about prototyping for digitization. We developed a prototype for our division BekaertDeslee that gathers data from the machines in production and stores it in Microsoft Azure. There, the data is processed and used for further analysis.
The article is a short journey that shows how the prototype was developed and explains how prototyping helps to evaluate new ideas.</p>
<p><a href="https://www.linkedin.com/post/edit/prototyping-first-step-digitization-dominik-bial"><img src="/assets/blog/prototyping-digitization.png" alt="impressions-prototyping" title="prototyping-digitization" /></a></p>
<h1 id="linkedin-posting">Started writing on linkedin - first article is about prototyping</h1>
<p><em>2019-02-16</em></p>
<p>Since the beginning of this year I have been working as an architect for emerging technologies at <a href="https://schacht.one/">SchachtOne</a>, where I am responsible for solution architectures for new products and digital business models, prototyping and trend scouting of new technologies.
I also publish articles for SchachtOne on linkedin, which I embed in this blog. The first article I published is about our new prototyping area in our offices.</p>
<p><a href="https://www.linkedin.com/pulse/impressions-from-our-s1-prototyping-corner-dominik-bial/"><img src="/assets/blog/Impressions-prototyping-corner.png" alt="impressions-prototyping" title="impressions from our prototyping corner at S1" /></a></p>
<h1 id="architecture-importance">Digitization and the importance of IT architecture</h1>
<p><em>2019-01-30</em></p>
<p>Did you already stumble over IT issues while running new digital business models and products? Especially small to medium enterprises struggle to set up IT structures to realize the full potential of their digital solutions.</p>
<p>Reasons are manifold, for example technical debt or the neglect of data flows.</p>
<p>Digital products and business models are often based on IT trends like Big Data, Artificial Intelligence or the Internet of Things. Nevertheless, classical systems like CRM, ERP and MES are no less important. More often than not, they are the foundation providing access to central data and workflows.</p>
<p>Let us have a look at the following scenario:
The company “Pervasive Automation” builds and sets up ticket vending machines for the public transport sector. Their machines can be found in various areas. Typically, the machines are placed at stops for buses, trams and trains, but they can also be found in pedestrian precincts or shopping centers - basically, everywhere people might buy tickets.
“Pervasive Automation” does not run public transport on its own but acts as a supplier. So far, the company has mainly sold the vending machines, apart from a few maintenance contracts.
During the last months “Pervasive Automation” realized that they might have a unique selling point in providing a service-based business model for their machines. As the machines are already supplied with electricity, connecting them to the Internet of Things is just the next logical step.
One main goal of this approach is to monitor the machines remotely and to gather data to improve maintenance. Instead of selling the machines, “Pervasive Automation” would like to charge a monthly fee and take over the risks and losses of non-working ticket vending machines.</p>
<p>Besides connecting the ticket vending machines, information from the CRM and ERP systems is required to keep the business model running. The CRM contains information about contact persons while the ERP system provides information about the vending machines’ history, versions and production data.</p>
<p><img src="/assets/blog/architecture-importance.png" alt="architecture example" title="machines, machine data and their relations to classical systems" /></p>
<p>Working on digital products typically addresses more than the product itself. A broader scope is required and further dependencies need to be identified. This is the area of work of IT architects. Currently, there are three main types of architecture roles:</p>
<p>A Software Architect has a focus on software systems and the fundamental, technical decisions which need to be made.</p>
<p>A Solution Architect also has a strong technical background and focusses on software systems. However, he also takes a strategic view, aligning business decisions with their influence on software systems and showing ways in which software systems can support business decisions.</p>
<p>An Enterprise Architect has a strategic view on the company’s IT architecture, guiding organizations through business, information and technology changes.</p>
<p>However, while the focus is different, the attitude is similar. Architects help to identify risks and missing links, keep the quality up or improve solutions regarding requirements and customer needs. They solve problems and try to define strategies, rules and approaches.</p>
<p>Architecture and the value architects generate during their work can be crucial for digitization. Architecture builds the foundation for new products and business models, while architects help to define approaches that lead to a fitting architecture.</p>
<h1 id="the-power-of-events">The Power of Events - IoT, Big Data and Machine Learning used in Combination</h1>
<p><em>2018-11-15</em></p>
<p>The book “The Power of Events: An Introduction to Complex Event Processing in Distributed Enterprise Systems” by David Luckham is a great starting point for understanding concepts around events, event processing and event-driven architectures. I read the book during my studies, and its lessons are becoming more and more important.</p>
<p>Basically, an event is a piece of data that represents some change, which can be in the real or the digital world.</p>
<p>IoT is about recognizing events in the real world, which allows us to trigger processes or gain deeper insights into our businesses. IoT devices can be controlled remotely. Moreover, they can interact with our environment, which results in a fully different understanding of automation. With the help of Big Data, events can be stored for later analysis. Machine Learning provides functionality to detect patterns in data. Additionally, Machine Learning algorithms can be trained on Big Data to detect patterns in event streams.</p>
<p><img src="/assets/blog/digital-integration-hub.png" alt="digital integration hub" title="digital integration hub" /></p>
<p>In <a href="/2018/05/integration-as-the-foundation-for-digitization.html">integration as the foundation for digitization</a> I introduced the digital integration hub, which is also shown above.</p>
<p>Let us have a look at the following scenario, which addresses public transportation via trams.</p>
<p>The mobile app in the scenario corresponds to the lower part of the image, where a backend for frontend is used to optimize and store data for the mobile application. The backend for frontend exposes data via an API that is secured by an API management layer. The API is published to be used by our mobile app.</p>
<p>Tracking of trams is achieved via an IoT Cloud, which can be understood as a pre-integration stage. The trams regularly send their position as well as specific events like arrivals or departures, but also the usage of an emergency brake. Via Fast Data streaming technology, users of the mobile app are informed about deviations from the timetable. Additionally, Machine Learning is used to predict delays with high probability.</p>
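<p>The delay notification can be thought of as simple pattern detection over the event stream. A hedged sketch of that idea (the event fields, the two-minute threshold and the three-in-a-row rule are illustrative assumptions, not from the book):</p>

```python
from dataclasses import dataclass

@dataclass
class ArrivalEvent:
    """One real-world change captured as a piece of data."""
    tram_id: str
    stop: str
    delay_seconds: int

def find_delayed_trams(events, threshold=120, streak=3):
    """Return ids of trams that arrived late `streak` times in a row."""
    streaks, flagged = {}, set()
    for ev in events:
        if ev.delay_seconds > threshold:
            streaks[ev.tram_id] = streaks.get(ev.tram_id, 0) + 1
            if streaks[ev.tram_id] >= streak:
                flagged.add(ev.tram_id)
        else:
            streaks[ev.tram_id] = 0  # an on-time arrival resets the streak
    return flagged
```

<p>A streaming engine evaluates exactly this kind of rule continuously over incoming events, so users can be notified the moment a tram matches the pattern.</p>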
<p>Timetables and trams are related via planning software. Moreover, trams are managed via an enterprise resource planning tool; based on events containing information about covered distances and usage time, maintenance cycles are optimized.</p>
<p>Finally, all communicated data, in the form of events, is stored in the historization block. No event is lost. A first thought might be that this data is great for timetable optimization, which is surely true. Moreover, you could think about playback scenarios to clarify accidents or even use this data for the training of tram drivers.</p>
<p>To sum up:</p>
<ul>
<li>Only the combination of IoT and events, Machine Learning and Big Data principles unfolds the full potential of digitization scenarios. Each technology on its own provides benefits, but the combination is truly powerful.</li>
<li>Digitization is about automation. Event recognition and backing data allow automated processes to be started. These automated processes make it possible to create new user experiences - like informing people about delays.</li>
<li>Mobile phones allow human beings to access and interact with the digital world. They are links to digital representations of our world.</li>
</ul>Dominik BialThe book “The Power of Events: An Introduction to Complex Event Processing in Distributed Enterprise Systems” by David Luckham is a great starting point for understanding events, event processing and event-driven architectures. I read the book during my studies. Its lessons are becoming more and more important.Short note - Jekyll rulez!!2018-10-01T00:00:00+02:002018-10-01T00:00:00+02:00/2018/10/jekyll-rulez<p>I finally reworked the whole website. I decided to use Jekyll so that I am able to write posts in Markdown instead of using a WYSIWYG editor - for example WordPress. So far, it has worked out pretty well.</p>
<p>Even though Jekyll is optimized to be used with GitHub Pages, there are plenty of plugins for building individual website deployment approaches. You can, like me, run Jekyll-based pages on your own webspace. I am happy with my current setup and hope that I can use it for a long time, so that I can concentrate on writing instead of bug fixing or updating infrastructure and software.</p>
<p>So - enjoy the new design ;-) I will enjoy writing.</p>Digital Exchange Rheinland2018-06-23T00:00:00+02:002018-06-23T00:00:00+02:00/2018/06/digital-exchange-rheinland<p>As described in <a href="%First inhouse Conference @ OPITZ Consulting%">First inhouse Conference @ OPITZ Consulting</a>, OPITZ Consulting ran its first inhouse conference in 2017 in Gummersbach at the TH Cologne. The conference was a huge success, so it is not very surprising that it took place in 2018 as well, now called <a href="https://digital-xchange.de/">digital exchange</a>. Moreover, it got even bigger: the conference was opened to everybody interested in digitization, and there was also a public call for proposals. The Minister of Innovation of North Rhine-Westphalia, Andreas Pinkwart, even gave a keynote.</p>
<p>I had the opportunity to give three talks:</p>
<ul>
<li>Darf es ein bisschen mehr sein? Privacy by Design und Geolokalisierung</li>
<li>Integration reloaded - Integrationslösungen auf Basis reaktiver Prinzipien</li>
<li>Ist BPM tot? Die Rolle von BPM in Microservice Architekturen</li>
</ul>
<p>The slides of the talks can be downloaded from the conference’s <a href="https://digital-xchange.de/download/">webpage</a>.</p>
<div class="jekyll-twitter-plugin"><blockquote class="twitter-tweet"><p lang="de" dir="ltr">Privacy by Design im Projektkontext. Jetzt in 3103. Mit <a href="https://twitter.com/lugus1980?ref_src=twsrc%5Etfw">@lugus1980</a> <a href="https://twitter.com/OC_WIRE?ref_src=twsrc%5Etfw">@OC_WIRE</a> <a href="https://twitter.com/hashtag/digitalxchange?src=hash&ref_src=twsrc%5Etfw">#digitalxchange</a> <a href="https://t.co/8bk7qfPulY">pic.twitter.com/8bk7qfPulY</a></p>— Dominik Bial (@BialDominik) <a href="https://twitter.com/BialDominik/status/1010510354617700352?ref_src=twsrc%5Etfw">June 23, 2018</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
</div>
<div class="jekyll-twitter-plugin"><blockquote class="twitter-tweet"><p lang="de" dir="ltr">Gleich geht's los. Raum 3103 um 10.30 Uhr mit <a href="https://twitter.com/sbernhardt?ref_src=twsrc%5Etfw">@sbernhardt</a> <a href="https://twitter.com/OC_WIRE?ref_src=twsrc%5Etfw">@OC_WIRE</a> <a href="https://twitter.com/hashtag/digitalxchange?src=hash&ref_src=twsrc%5Etfw">#digitalxchange</a> <br /><br />Freuen uns auf spannende Diskussionen und viele Zuhörer. <a href="https://t.co/gXz09QFndc">pic.twitter.com/gXz09QFndc</a></p>— Dominik Bial (@BialDominik) <a href="https://twitter.com/BialDominik/status/1010435867205304320?ref_src=twsrc%5Etfw">June 23, 2018</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
</div>
<div class="jekyll-twitter-plugin"><blockquote class="twitter-tweet"><p lang="de" dir="ltr">Es füllt sich <a href="https://twitter.com/hashtag/digitalxchange?src=hash&ref_src=twsrc%5Etfw">#digitalxchange</a> <a href="https://twitter.com/OC_WIRE?ref_src=twsrc%5Etfw">@OC_WIRE</a> <a href="https://t.co/BAMxOLS43u">pic.twitter.com/BAMxOLS43u</a></p>— Marco Buss (@marcobuss) <a href="https://twitter.com/marcobuss/status/1010417284094021637?ref_src=twsrc%5Etfw">June 23, 2018</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
</div>