
Let's start with the result:

The heartbeat of Berlin

One of the major questions about a location is: what/whom can I reach in [x] minutes? For this, ArcGIS offers the Network Analyst with its "Service Areas" tool. We can analyze the area reachable by car (other travel modes are possible) within a given time period. By using several input points we can also analyze patterns across a whole area.

Using a normal network, the service areas would not change during the day, but with the ArcGIS Online network dataset based on HERE data we can also use the traffic patterns for a given day of the week and time. Traffic patterns can be analyzed at a 15-minute resolution. Analyzing the service areas over the course of a day provides valuable insight into the traffic situation of a region: big differences primarily indicate higher traffic volume on the streets. The total area also indicates the quality of the network: large areas indicate better accessibility by car (perhaps caused by a dense network of high-speed roads).

Automated Analysis

As I love geoprocessing, I was interested in automating the analysis and visualizing it as an animation. The process is therefore a two-step one.

First I created a simple point feature class called "StartPoints" which holds the locations of the "facilities" and added them to the Service Area Analysis Layer:

#create Analysis Layer
arcpy.na.MakeServiceAreaAnalysisLayer("https://www.arcgis.com/", "BeatCity Service Layer", "Driving Time", "FROM_FACILITIES", "5;10;15;20;25;30", "05.01.1900 00:01:00", "LOCAL_TIME_AT_LOCATIONS", "POLYGONS", "STANDARD", "DISSOLVE", "RINGS", "100 Meters", None, None)
#add locations from a point feature class
arcpy.na.AddLocations("BeatCity Service Layer", "Facilities", "StartPoints", "Name Description #;CurbApproach # 0;Attr_Minutes # 0;Attr_TravelTime # 0;Attr_Miles # 0;Attr_Kilometers # 0;Attr_TimeAt1KPH # 0;Attr_WalkTime # 0;Attr_TruckMinutes # 0;Attr_TruckTravelTime # 0;Breaks_Minutes # #;Breaks_TravelTime # #;Breaks_Miles # #;Breaks_Kilometers # #;Breaks_TimeAt1KPH # #;Breaks_WalkTime # #;Breaks_TruckMinutes # #;Breaks_TruckTravelTime # #", "5000 Meters", None, None, "MATCH_TO_CLOSEST", "APPEND", "SNAP", "5 Meters", "EXCLUDE", None)

Now we need to alter the times automatically to reflect the times of day. The start time is '05.01.1900 00:00:00', which is a Friday at midnight.

We can access the properties quite easily. As I am working with a 3D scene, this is the only map object in my ArcGIS Pro project:

# get the project
doc = arcpy.mp.ArcGISProject('current')
# get the map (named map_obj to avoid shadowing the built-in map)
map_obj = doc.listMaps()[0]
# get the service area layer
sa_layer = map_obj.listLayers("BeatCity Service Layer")[0]
# get the solver properties object from the service area layer
solver_props = arcpy.na.GetSolverProperties(sa_layer)

As we now have access to the properties, we can easily iterate and solve the service area layer. The result will be exported as a 3D feature class:

import datetime
import os

for hour in range(0, 24):
    # create the datetime object
    date = datetime.datetime(1900, 1, 5, hour, 0, 0)
    # we will use the string version later on
    datestring = date.strftime("%d.%m.%Y %H:%M")
    # set the time of the day and the date
    solver_props.timeOfDay = date
    # solve the network layer
    arcpy.na.Solve("BeatCity Service Layer", "SKIP", "TERMINATE", None, None)
    arcpy.AddMessage("copying hour " + str(hour))
    # as we would like to visualize this in 3D
    out_fc = arcpy.env.workspace + os.sep + "Friday_" + str(hour)
    arcpy.ddd.FeatureTo3DByAttribute(r"BeatCity Service Layer\Polygons", out_fc, "ToBreak", "ToBreak")
    # and add a field with the date for later usage if wanted
    arcpy.AddField_management(out_fc, "datetime", "Date", "", "", 8, "datetime", "NULLABLE", "REQUIRED")
    arcpy.management.CalculateField(out_fc, "datetime", '"' + datestring + '"', "PYTHON_9.3")

Warning: Solving the layer consumes credits!

The results are stored in separate feature classes, which is not optimal, but it is my solution at the moment.

Visualization and Export

As we now have separate layers, I apply the same style to all of them, toggle the visibility of each layer, and export the layout, which needs to be created prior to the export!

My Layout looks something like this:

Layout with tilted view and extruded layer

I added a text element with the default text "Friday" to the layout so the timestamp gets exported into the PNG result as well. Furthermore, I added a marker symbol with the city's name to the layout and tilted the view a bit to increase the depth effect of the extruded layer. But first we need to disable all layers for our automated export with a rather unsophisticated approach:

# deselecting all layers
for layer in map_obj.listLayers():
    if layer.name[0] == "F":
        layer.visible = False

Now we can iterate through the layers, apply the unique value renderer, alter the symbols to get polygons without borders, set the extrusion, alter the text element to get the right timestamp, and export the layout as PNG:

for hour in range(0, 24):
    poly_layer = map_obj.listLayers("Friday_" + str(hour))[0]
    poly_layer.visible = True
    symbology = poly_layer.symbology
    symbology.updateRenderer('UniqueValueRenderer')
    symbology.renderer.fields = ['ToBreak']
    symbology.renderer.colorRamp = doc.listColorRamps('Green-Blue (6 Classes)')[0]
    # alter all symbols
    for sym in symbology.renderer.groups[0].items:
        sym.symbol.outlineColor = {'RGB': [255, 255, 255, 0]}
        sym.symbol.size = 0.0
    poly_layer.symbology = symbology
    # extrude
    poly_layer.extrusion('BASE_HEIGHT', "3000-100*[ToBreak]")
    # export
    datestring = "2017/12/08 " + str(hour) + ":00"
    for lyt in doc.listLayouts():
        for elm in lyt.listElements("TEXT_ELEMENT"):
            if elm.text[0] == "2":
                elm.text = datestring
        lyt.exportToPNG(arcpy.env.workspace + os.sep + "friday_seattle_final" + str(hour) + ".png", 300)
    poly_layer.visible = False

Unfortunately, the whole rendering and exporting process takes about 2.5 minutes per scene on my Lenovo X270 with no dedicated GPU.

The result is a bunch of PNGs that can be converted into a GIF using one of the many GIF-creation sites on the web or the open-source image editor GIMP.
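If you prefer to script this step as well, the frames can be stitched into an animated GIF directly in Python. A minimal sketch, assuming the Pillow library is installed; the frame paths follow the naming scheme from the export loop above:

```python
from PIL import Image

def pngs_to_gif(png_paths, out_path, ms_per_frame=250):
    """Combine the exported PNG frames into one looping GIF."""
    frames = [Image.open(p) for p in png_paths]
    frames[0].save(
        out_path,
        save_all=True,              # write all frames, not just the first
        append_images=frames[1:],   # the remaining frames
        duration=ms_per_frame,      # display time per frame in milliseconds
        loop=0,                     # 0 = loop forever
    )

# e.g.: pngs_to_gif(["friday_seattle_final%d.png" % h for h in range(24)],
#                   "friday_seattle.gif")
```

Passing the paths in an explicit hour order avoids the lexicographic-sorting trap (hour 10 sorting before hour 2).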

The result looks like this:

 

You can also append all polygon features from the results into one feature class, add the datetime attribute, fill in the time used, time-enable the feature class, and use the animation options in ArcGIS Pro to export the whole animation as a GIF:

The result looks quite similar to the file-wise export, although you don't have all the possibilities of the layout view. But still the result is quite good (with 5, 10, 15, 20 min breaks added):

As a Christmas surprise at the end of 2017, the new major release 100.2 of ArcGIS Runtime has been published. The goal of this important release is to close the functional gaps to the ArcGIS Runtime 10.2.x versions and to ArcGIS Engine. Besides bug fixes and improvements in 2D and 3D rendering, datum transformations, mobile workflows, and many new functions have been implemented.

 

Some highlights are:


• New WMS layer for WMS 1.3 and ENC layer for the S-57 format



      ENC Layer

 

• Support for shapefiles and OGC GeoPackages
• New 3D scene analysis API, e.g. for viewshed and line of sight



      3D Scene - Viewshed

 

• Datum transformations, including user-defined transformations



Datum transformation

 

• Export of vector tile packages from vector tile map services hosted in ArcGIS Online or ArcGIS Enterprise
• Improved editing
• Preplanned workflow for mobile scenarios
• Time-aware layers



      Time-aware Layer

 

• and much more

 

Detailed information about all the new features in ArcGIS Runtime 100.2 can be found in this news blog by Esri Inc., on ArcGIS for Developers in the What's New sections of the SDKs, and on GeoNet in the corresponding areas.

DevLab: "Create a web map (2D)"

 

For a quick start into ArcGIS technology, the ArcGIS DevLabs on developers.arcgis.com offer a very good opportunity for developers and those who want to become developers. They guide interested users through the three phases of building location-aware apps in small exercises:

 

  • Acquiring and processing data
  • Visualizing and styling data
  • Developing your own apps with the corresponding functionality

 

The estimated duration is given separately for each DevLab and is between 5 and 15 minutes, which makes it possible to complete an exercise without interruption. Depending on the user's level of experience, the duration may vary slightly if additional software has to be downloaded and installed first.

 

These pre-installations include, for example, the ArcGIS Runtime SDK for the target operating system, or Postman for querying the ArcGIS REST API. Keys and values for the query are already provided in order to get a quick, interpretable result.

 

Due to the large selection of DevLabs, the offering can be narrowed down to the user's needs using various filters. Whether you are looking for a specific topic such as routing or spatial analysis, for a product such as iOS or Android, or directly for the Esri REST API, this can be selected in advance.

 

Here is an exemplary request and response JSON for a query against the ArcGIS World GeoEnrichment Service from the DevLab "get demographic data".

 

f:json
token:GbGyIztsoLGLacjX_3W1EOF90c3HqTRZ_P9qah1At01tAqFaTI-MeWcQhEwAy7oHW5giMyhM8YGVfLnBN8xVMUFOF9kGOOSU84qopHw2Cb-rkWkc_-9hDcOW1fkn-1VH
inSR:4326
outSR:4326
returnGeometry:false
studyAreas:[{"geometry":{"x":-118.09047,"y":33.81091}}]
studyAreasOptions:{"areaType":"RingBuffer","bufferUnits":"esriMiles","bufferRadii":[1]}
dataCollections:["KeyGlobalFacts"]
returnFields:false
{
    "results": [
        {
            "paramName": "GeoEnrichmentResult",
            "dataType": "GeoEnrichmentResult",
            "value": {
                "version": "2.0",
                "FeatureSet": [
                    {
                        "displayFieldName": "",
                        "fieldAliases": {
                            "ID": "ID",
                            "OBJECTID": "Object ID",
                            "sourceCountry": "sourceCountry",
                            "areaType": "areaType",
                            "bufferUnits": "bufferUnits",
                            "bufferUnitsAlias": "bufferUnitsAlias",
                            "bufferRadii": "bufferRadii",
                            "aggregationMethod": "aggregationMethod",
                            "HasData": "HasData",
                            "TOTPOP": "Total Population",
                            "TOTHH": "Total Households",
                            "AVGHHSZ": "Average Household Size",
                            "TOTMALES": "Male Population",
                            "TOTFEMALES": "Female Population"
                        },
                        "spatialReference": {
                            "wkid": 4326,
                            "latestWkid": 4326
                        },
                        "fields": [
                            {
                                "name": "TOTMALES",
                                "type": "esriFieldTypeDouble",
                                "alias": "Male Population",
                                "fullName": "KeyGlobalFacts.TOTMALES",
                                "component": "demographics",
                                "decimals": 0,
                                "units": "count"
                            },
                            {
                                "name": "TOTFEMALES",
                                "type": "esriFieldTypeDouble",
                                "alias": "Female Population",
                                "fullName": "KeyGlobalFacts.TOTFEMALES",
                                "component": "demographics",
                                "decimals": 0,
                                "units": "count"
                            }
                        ],
                        "features": [
                            {
                                "attributes": {
                                    "ID": "0",
                                    "OBJECTID": 1,
                                    "sourceCountry": "US",
                                    "areaType": "RingBuffer",
                                    "bufferUnits": "esriMiles",
                                    "bufferUnitsAlias": "Miles",
                                    "bufferRadii": 1,
                                    "aggregationMethod": "BlockApportionment:US.BlockGroups",
                                    "HasData": 1,
                                    "TOTPOP": 12483,
                                    "TOTHH": 4793,
                                    "AVGHHSZ": 2.6,
                                    "TOTMALES": 5998,
                                    "TOTFEMALES": 6485
                                }
                            }
                        ]
                    }
                ]
            }
        }
    ],
    "messages": []
}

 

Conclusion: the DevLabs are a varied opportunity to get familiar with the Esri world.

 

Have fun trying them out! Andreas and Florian

Acquiring geodata is a time-consuming task in many projects. The data does not have to be highly accurate in every case. For some questions, input from OpenStreetMap is sufficient, be it for student projects, for getting an overview of a topic, or for providing a target group with information. It is possible to download this data, import it into ArcMap or ArcGIS Pro, filter the desired data, and then publish it, for example, as a service. This approach is very time-consuming, so a solution is needed that connects the OpenStreetMap data source directly to the ArcGIS platform without the detour through a desktop product.

 

 

The ArcGIS API for Python lends itself to implementing this idea, since in addition to various analysis tools it also offers versatile access to ArcGIS Online and Portal for ArcGIS. Based on this technology, several Python scripts were created and published on GitHub. The goal of this project is to automatically publish data within a user-defined area as a feature service.

 

The implementation consists of two Python scripts, which can also be used separately. Before the data can be processed, however, the user has to adapt two configuration files. One of these files defines the configuration for the OSM data to be published via a JSON file. To be able to start the script, several parameters, as described in the table, have to be set.

 

 

In addition to a list of categories, which are represented in OSM as key-value pairs, further attributes can be taken over from the datasets. A bounding box and the desired geometries must also be specified. Based on this input, a Python script is executed that retrieves the desired data via requests to the Overpass and OSM APIs.

 

response = api.Get('node[' + category + '](' + minLat + ',' + minLon + ','
                   + maxLat + ',' + maxLon + ')', responseformat="json")
elements = response["elements"]
for element in elements:
    dictElement = {}
    tags = element["tags"]
    # take over the configured attributes, if present on this element
    for key_att in attributes:
        val_att = attributes[key_att]
        if val_att in tags:
            dictElement[key_att] = tags[val_att]
    dictElement["id"] = element["id"]
    dictElement["lon"] = element["lon"]
    dictElement["lat"] = element["lat"]
    # user and timestamp require an extra lookup via the OSM API
    if "user" in attributes or "timestamp" in attributes:
        try:
            node = oApi.NodeGet(element["id"])
            if "user" in attributes and "user" in node:
                dictElement["user_"] = node["user"]
            if "timestamp" in attributes and "timestamp" in node:
                dictElement["timestamp"] = node["timestamp"]
        except Exception:
            print("Node for this element not available")
    dictElement["attribute"] = key_cat + "-" + val_cat
    dictData.append(dictElement)

 

These data are then structured in a dictionary, converted into a pandas DataFrame, and returned for further use. A second JSON file defines the configuration for ArcGIS Online or Portal for ArcGIS.
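The conversion step itself is a one-liner. A minimal sketch with made-up sample records (the real scripts collect these dictionaries from the Overpass response as shown above):

```python
import pandas as pd

# hypothetical sample of the collected OSM records
dictData = [
    {"id": 1, "lon": 13.40, "lat": 52.52, "attribute": "amenity-cafe"},
    {"id": 2, "lon": 13.41, "lat": 52.53, "attribute": "tourism-museum"},
]

# pandas builds one column per dictionary key;
# keys missing in a record become NaN in that row
df = pd.DataFrame(dictData)
print(df.shape)  # (2, 4)
```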

 

 

Once the required data is available, it is checked for correctness. This means that a connection is established using the login information. This is necessary to ensure that the feature service ID is correct if the service is to be overwritten. If the input is complete and correct, another Python script is started. When updating a feature service, the service is emptied, the DataFrame is split into chunks, and the features are added to the feature service in blocks of 100.

 

listAddFeatures = []
i = 0
dataAvailable = True
dataUploaded = False
dataQuery = fc_dataAdd.query()

while dataAvailable:
    # every 100 collected features, push the block to the service
    if i % 100 == 0 and i != 0 and not dataUploaded:
        layer.edit_features(adds=listAddFeatures)
        listAddFeatures.clear()
        dataUploaded = True
        print(str(i) + " Features of " + str(len(dataQuery)) + " added.")
    else:
        try:
            listAddFeatures.append(dataQuery.features[i])
            i = i + 1
            dataUploaded = False
        except IndexError:
            dataAvailable = False

# upload the remaining (incomplete) block
layer.edit_features(adds=listAddFeatures)
print("All " + str(len(dataQuery)) + " Features added.")

 

When creating a new feature service, additional steps are required before uploading the data, because an empty feature service with the required fields has to be created first. For this, a new DataFrame is created that contains only the header row of the one passed in. Then the script iterates through all existing fields and adds the fields with the data type "int64" to a new list.

 

newField = {
    "name": intFieldName,
    "type": "esriFieldTypeInteger",
    "alias": intFieldName,
    "sqlType": "sqlTypeBigInt",
    "nullable": True,
    "editable": True,
    "visible": True
}

token_URL = "{}/sharing/generateToken".format(portal)
token_params = {
    'username': user,
    'password': password,
    'client': 'referer',
    'referer': portal,
    'expiration': 60,
    'f': 'json'
}

r = requests.post(token_URL, token_params)
token_obj = r.json()

token = token_obj['token']
expires = token_obj['expires']
tokenExpires = datetime.datetime.fromtimestamp(int(expires) / 1000)

featureLayerAdminUrl = layerURL.replace("/rest/", "/rest/admin/")

params = {"f": "json", "token": token}
params["addToDefinition"] = json.dumps({"fields": [newField]})

layerUpdateUrl = "{}/addToDefinition".format(featureLayerAdminUrl)
layerResult = requests.post(layerUpdateUrl, params)

 

This step is important because when publishing a feature service via the ArcGIS API for Python, integer fields are only created as "int32", and errors occur with larger values. So a feature service is created, and the "int64" fields are then added via a request to the portal API, since this way larger integer values can be stored.

 

for field in listBigInt:
    del dataframe_total_title[field]

fc = gis.content.import_data(dataframe_total_title)

item_properties_input = {
    "title": title,
    "tags": tags,
    "description": description,
    "text": json.dumps({"featureCollection": {"layers": [dict(fc.layer)]}}),
    "type": "Feature Collection",
}

item = gis.content.add(item_properties_input)
new_item = item.publish()

 

Now the service is created with all required attributes, and the features can be added in chunks, as in the update case.

 

The result is a GitHub repository containing the various scripts. They can be used in their current state or individually, and the workflows can be adapted to your own needs. For example, the script could be extended with methods to store the data not as a feature service but as a shapefile or geodatabase.

 

With this tool, we have succeeded in building a bridge between the rich OpenStreetMap data and the powerful ArcGIS platform.

Hey fellow geo geeks.

 

Last week we hosted our last GeoDev meetup for this year. The topic this time was "The Power of Scripting" and it was all about automating your work, be it analysis or administrative tasks. It turned out that December 6th (aka Santa Claus Day) was not perfect timing, so many thanks to those of you who made it anyhow! And for those of you who couldn't make it, don't worry, we've got you covered with recordings.

 

Chenyu Zuo, TU München: Mining traffic information from social media data

Social media has become prevalent in the last decade. People publish on various topics (for instance food, work, traveling, sports, and health) on a variety of social media platforms on a daily basis, which reflects a wide range of socioeconomic phenomena and human behaviors. This rich social media data can also be used for extracting and analyzing traffic information. For example, where and when do people complain about public transport services? How is traffic congestion related to traffic accidents? Which kinds of human activities are related to traffic? Mining such information is very interesting and challenging. In this talk, we hope to provide you with solutions for extracting events from textual data and innovative ways of analyzing geo-data.

 

 

 

Jan Wilkening, Esri Deutschland: R for GeoGeeks - The Interface between R and GIS

Interesting things happen at the interface of GIS and the popular scripting language R: GIS users can profit from the broad range of publicly available R scripts, while R users can profit from the power of user-friendly GIS. This talk shows GIS users what R is and how they can use it for their daily work. For those more familiar with R, the talk will also highlight how R users can enhance their R routines with GIS.

 

 

Alexander Erbe, Esri Deutschland: The ArcGIS Python API

The powerful ArcGIS API for Python lets you script and automate tasks in ArcGIS using a Pythonic library. With the Python API, you can streamline data analysis, content management, and web GIS administration. This lecture will give you a brief overview of the possibilities based on some practical examples, e.g. converting OSM data into ArcGIS feature services using the OSM API.

 

 

 

Enjoy your Christmas break!

Alex and Lars

 

 


Wherever many people come together, challenges arise. Both outdoors, but especially inside buildings, things can quickly get crowded somewhere. This is not uncommon at major events such as concerts or sporting events with several thousand visitors. But even with a few hundred people, there can be a queue in front of the single toilet, or a lecture room may not offer enough space for all participants.

 

Wouldn't it be nice to see live, on a map of the building, how crowded a specific place is, whether there is still space available, and whether it is worth making the trip at all?

And wouldn't it be nice for the organizer to always have an up-to-date overview of the utilization of individual areas of the building during the event, and perhaps proactively avoid bottlenecks?

Wouldn't it be great if you could use the results of a current event to be better prepared for follow-up events?

 

This is precisely the question we asked ourselves in advance of this year's Esri Developer Summit in Berlin.

Together with our startup partner, Square Metrics, we developed a demo scenario in which iBeacons are used to measure the occupancy of specific locations in the building, by measuring when and how often a particular beacon is triggered by an approaching participant.

 

The (anonymous!) information sent by each beacon was first sent to the Proximity DMP of Square Metrics, where the individual beacons were registered and organized. From there, every single event was forwarded in real time to the ArcGIS backend, where a GeoEvent Server received the information, evaluated it in real time, and saved it for later analysis. The following graphic shows an overview of the components used and the interaction between them.

 

 

The graphic also makes it clear that ArcGIS Enterprise, in conjunction with external device management, can be turned into a full-fledged IoT platform.

 

This allowed us to show on a map how many times a particular sensor was triggered, both when someone entered a certain area and when they left it. If you place a beacon in a room, you can quickly calculate how many people went in and out at a certain time. Compared with the actual capacity, you can quickly derive an indicator of whether a particular room is overcrowded or whether there are still free seats available.
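The counting logic behind this indicator can be sketched in a few lines of Python. The event format and room capacities here are made up for illustration; the real pipeline runs on GeoEvent Server:

```python
def room_occupancy(events, capacity):
    """Derive current occupancy and a load factor per room from
    a stream of beacon events.

    events   -- iterable of (room, direction) tuples, where direction
                is "in" or "out" (a simplified, hypothetical format)
    capacity -- dict mapping room name to its seat capacity
    """
    counts = {room: 0 for room in capacity}
    for room, direction in events:
        if direction == "in":
            counts[room] += 1
        elif direction == "out":
            counts[room] = max(0, counts[room] - 1)  # never go negative
    # indicator: (current count, share of capacity in use)
    return {room: (n, n / capacity[room]) for room, n in counts.items()}
```

A room with a load factor near or above 1.0 would be flagged as crowded on the map, while values well below 1.0 signal free seats.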

 

 

Hotspots and cold spots can be identified quickly, which allows conclusions as to which lectures are well received by the audience and which are less so.

 

 

Taking into account not just a single beacon but all the sensors distributed in the building, you can even detect movement profiles, for example whether or not there is a connection between certain places (and the content presented there). Did a JavaScript developer also attend .NET lectures, or was he in the cafeteria at those times?

 

With all this information, you can react to specific incidents during the event and analyze the current event afterwards, so that the following events are adapted to the behavior of the participants.

 

No more queues in front of the toilet, always a free seat in the lecture: it does not have to remain a utopia, with iBeacons and ArcGIS as a real-time IoT platform!
