Today our phones resemble a genie’s lamp: a powerful wish-granting device at our fingertips! They can bring us anything and take us places in just a few clicks. But how did this happen, and what role does location play in it?
Let us consider a situation which has probably happened to all of us. Sometimes when we book a cab, especially during peak times, we are met with a message that says “we cannot find a ride, try again”. This is a classic case of a supply-demand gap, and its causes almost always come down to location: where the drivers are, where the demand is, and why the two fail to meet.
As you can see, for a company with moving assets, a lot of the insights and decisions have to be derived from the location. Companies like Uber, Amazon, and Airbnb, among others, realized the immense value of this data in their nascent stages and built their business with location as their foundation. Uber revolutionized the usage of maps and started understanding the flow of their cities to become the top ride-sharing player in the world. Similarly, Amazon has built one of the most resilient supply chains by tapping into the power of location data and the list goes on.
Read on to find out how you can perform geospatial analytics and make the most of your location data!
In simple words, location data is information about the geographic position of an entity (a user, a point of interest, and so on). It is also called geospatial data, geographic data, or geodata. It is often formatted as points (latitude-longitude coordinates), polygons, or polylines.
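A minimal sketch of these three shapes, encoded as GeoJSON geometries (the coordinates below are illustrative, given as [longitude, latitude] pairs):

```python
import json

# Three common shapes of location data, as GeoJSON geometries.
point = {"type": "Point", "coordinates": [77.5946, 12.9716]}  # a single position
polyline = {"type": "LineString",                             # a route or trajectory
            "coordinates": [[77.59, 12.97], [77.60, 12.98], [77.61, 12.99]]}
polygon = {"type": "Polygon",                                 # an area, e.g. a delivery zone
           "coordinates": [[[77.58, 12.96], [77.62, 12.96],
                            [77.62, 13.00], [77.58, 13.00],
                            [77.58, 12.96]]]}                  # the ring closes on itself

geojson = {"type": "FeatureCollection",
           "features": [{"type": "Feature", "geometry": g, "properties": {}}
                        for g in (point, polyline, polygon)]}
print(json.dumps(geojson)[:50], "...")
```

GeoJSON is one common interchange format for such data; most geospatial tools (PostGIS, Kepler.gl, and others) can read it directly.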
What differentiates location data is the extra dimension it carries: values change across time, and the positions themselves are dynamic. This calls for a slightly different analytical approach. Remember, location is not just about individual points, but also about movement.
There are millions of moving assets, from delivery fleets and vehicles to users and drivers on the ground, that need to be monitored and optimized at all times. For example, within a city, no two areas behave the same way at a given time. Some have more movement during the day and some during the night. We also have to consider external factors like weather, accidents, traffic, and rallies, which change the behavior of areas without warning.
Therefore, it is important for us to contextualize our decisions for different areas and times and practice location-based decision-making. It is only through the lens of location intelligence that we can unearth the patterns, relationships, and trends between the moving assets and answer questions like:
Churn analysis console: Identify where the user churn is the highest
Robust location analytics platforms give you relevant real-time hyperlocal insights that can transform your day-to-day operations. You can:
Altogether, this will promote better asset utilization, reduce churn, and improve the overall performance of businesses built on moving assets.
Locale.ai Console: Impact analysis
Now that the importance of location analytics has been established, let us look at how you can go about making decisions about your on-ground assets.
We can break this down into five essential steps.
Getting insights involves gathering data from different places. After that, the data has to be cleaned and checked for any outliers. Once that data is ready to be used, we can move on to creating relevant metrics.
Today this is done by writing complex queries and performing data transformations in R and Python. Other widely used tools for heavy data processing and geospatial operations are PostgreSQL and its PostGIS extension. While these tools offer the capabilities to process your data, creating complex metrics like idle time, unfulfilled demand, or route deviations can be time-consuming and requires an intimate understanding of the problem.
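To make one of these metrics concrete, here is a minimal sketch of computing idle time from a sequence of hypothetical GPS pings. The 50-metre threshold and the (timestamp, x, y) schema are assumptions for illustration; positions are assumed to be already projected to metres.

```python
from datetime import datetime, timedelta

def idle_time(pings, moved_threshold_m=50):
    """Sum the time between consecutive pings where the asset barely moved.

    `pings` is a list of (timestamp, x_m, y_m) tuples, positions in metres.
    """
    total = timedelta()
    for (t0, x0, y0), (t1, x1, y1) in zip(pings, pings[1:]):
        if ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 < moved_threshold_m:
            total += t1 - t0  # barely moved during this interval: count as idle
    return total

# Hypothetical pings: the driver sits still for 10 minutes, then moves away.
t = datetime(2021, 5, 3, 16, 0)
pings = [(t, 0, 0),
         (t + timedelta(minutes=10), 5, 5),       # ~7 m from last ping: idle
         (t + timedelta(minutes=15), 900, 400)]   # far from last ping: moving
print(idle_time(pings))  # 0:10:00
```

A production version would also handle gaps in the ping stream and use geodesic distances on raw latitude-longitude pairs, which is where libraries like PostGIS earn their keep.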
This important step helps us unearth patterns and relationships between the moving assets. Essentially, it is about identifying the problematic areas and arriving at actionable insights. For example, an actionable insight could be that there is a high demand near colleges at 4 PM during the weekdays.
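The college-demand example above can be sketched as a simple group-and-count over hypothetical ride-request records; the area names, hours, and counts below are all illustrative.

```python
from collections import Counter

# Hypothetical ride requests as (area, hour_of_day, day_of_week) tuples.
requests = [
    ("college_zone", 16, "Tue"), ("college_zone", 16, "Wed"),
    ("college_zone", 16, "Thu"), ("business_park", 9, "Tue"),
    ("college_zone", 11, "Sat"), ("business_park", 18, "Wed"),
]

weekdays = {"Mon", "Tue", "Wed", "Thu", "Fri"}

# Count demand per (area, hour) cell, restricted to weekdays.
demand = Counter((area, hour)
                 for area, hour, day in requests if day in weekdays)

hotspot, count = demand.most_common(1)[0]
print(hotspot, count)  # ('college_zone', 16) 3
```

In practice the "area" would itself be derived from coordinates, by snapping points to a grid of cells or to polygons of neighborhoods, which is exactly the kind of geospatial operation generic BI tools struggle with.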
Today, this process is handled by BI tools like Tableau, Looker, and Power BI, and open-source software like Kepler.gl. Though the BI tools come with basic geospatial functionality, they are best suited to statistical analysis. Features like analysis on routes, grids, and layering are unsupported; you can only plot points and zip codes with these tools. Tableau, for instance, has great heat maps, but visualization is not the same as analysis.
Kepler.gl, meanwhile, provides a great platform for exploring everything related to maps. But it is useful mainly for one-off use cases, and since the data is processed outside the tool, it cannot serve as a catalog in the future. This makes the process unsuitable for repeated iterations and for situations specific to your industry.
Source: Locale.ai
As mentioned earlier, there needs to be a provision to make decisions either in real-time when you need to react or more strategically when you need to analyze patterns and trends.
For example, a promotion would be sent via Mixpanel, a pricing problem would be solved by tweaking a formula, and a driver issue would be handled by a ground operations person. Implementing any strategy therefore requires multiple teams to come together to decide and act on the data. This process is painstaking and demands top-notch coordination, and it becomes particularly problematic when a decision has to be taken in real time.
Analytics cannot be done in a silo. People across teams must be able to come together to debug and implement decisions. Today, we can use Slack, Jira, and similar tools to collaborate, but they do not come with a knowledge base, and clearing a ticket in Jira can take longer than expected. When decisions span different stakeholders, the context is often unclear and results are delayed.
Once you have made your decisions, you can measure the impact of your actions and iterate. It is the only way to know if your solution is working. This step is missing from today’s analytics process due to the lack of the right infrastructure, which not only hinders experimentation but also leaves no way to assess a strategy across different areas and times. For example, we could run an experiment by provisioning extra delivery-fleet members in the western part of the city and check whether Monday delays go down.
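The western-zone experiment above could be assessed with a simple before-versus-after comparison; the delay figures below are hypothetical, and a real evaluation would also want a control area and a significance test rather than raw means.

```python
from statistics import mean

# Hypothetical Monday delivery delays (minutes) in the western zone,
# before and after provisioning extra fleet members.
delays_before = [22, 30, 18, 27, 25]
delays_after = [15, 19, 12, 17, 14]

lift = mean(delays_before) - mean(delays_after)
pct = 100 * lift / mean(delays_before)
print(f"Average delay dropped by {lift:.1f} min ({pct:.0f}%)")
```

Running the same comparison per area and per time window is what lets you tell whether a fix generalizes or only worked in one corner of the city.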
A robust analytics platform should be able to perform all these actions seamlessly. When done right, the teams will be able to drill down and have access to insights at their fingertips.
Locale.ai is working towards empowering ops and city teams, in the same spirit as Mixpanel and CleverTap. It is designed to act as a central operations platform by ingesting location data across users, fleets, and orders.
We enable teams to build their own consoles and metric templates, perform various analyses, create workflows, run experiments, measure performance, and collaborate, all in one platform. As a user, you can customize every console or dashboard for your use case in just a few clicks, without any engineering bandwidth. One thing all of us should internalize about geospatial analytics is that it does not stop at visualizations.
But doing this requires organizations to invest in strong data pipelining and management architectures for their geospatial data. Unfortunately, handling geospatial data is still a hard problem. Add scale, contextual visualization, and near-real-time latencies to the mix, and such analysis remains out of reach for most organizations. With Locale, we want to bridge this gap and envision a world where every company can tap into the power of location data with ease.
Source: Locale.ai
If the world of geospatial analytics excites you and you want to delve deeper into location data, here are a few resources you can check out:
https://blog.locale.ai/how-were-building-our-geospatial-analytics-product-using-first-principles-2/
The media shown in this article are not owned by Analytics Vidhya and are used at the Author’s discretion.