A map in progress I'm making for a friend's research on chemical levels in Texas coastal bays and estuaries.
Google Earth Pro is one of the most powerful pieces of freely available software for location investigation. If you're a non-technical GIS user who needs to know just enough to get your work going, let's hit the ground running with this starter tutorial.
🟢 Beginner-friendly.
🆓 Free with no hidden monetary cost.
🖥️ Available for Windows, Mac and Linux.
Ok.
I want to know why I have never heard of this online tool before. Like, what is wrong with social media? Did Twitter or Instagram just never catch on to mapshaper? Or was it just me and my hazardous ignorance, yet again?
Have you tried this nifty free online tool that literally simplifies crazy-complicated shapefile polygons like it's nobody's business?!
It started with some last-minute inspiration on how to collate data from three different regions, each derived from a different remote sensing technique. The common goal was to turn all of them into vector files, namely shapefiles, and then work on the attributes to ease merging the different shapefile layers.
Once merged, this shapefile is to be published as a hosted feature layer on the ArcGIS Online platform and incorporated into a web map that serves as reference data for configuring/designing a dashboard. What is a dashboard? It's basically an app template in ArcGIS Online that summarizes all the important information in your spatial data. It's a fun app to create, and no coding skills are required. Check out the gallery here for reference:
Operations Dashboard for ArcGIS Gallery
There are two common ways to publish a hosted feature layer to the ArcGIS Online platform.
Method 1: Zip up the shapefile and upload it as your content. This will trigger a prompt asking if you would like to publish it as a hosted feature layer. Click 'Yes', give it a name, and et voila! You have successfully published a hosted feature layer.
Method 2: From ArcGIS Desktop or ArcGIS Pro, publish it as a feature service (as ArcMap calls it) or a web layer (as its sister product ArcGIS Pro calls it). Fill in the details, enable the required capabilities, then hit 'Publish' and it will appear on the platform, provided there are no errors or conflicting issues.
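If you do Method 1 often, the zipping step is easy to script. Here is a minimal sketch using only Python's standard library; the base name "parcels" and the temp folder are hypothetical stand-ins for your own shapefile:

```python
# Bundle a shapefile's sidecar files into one zip for upload.
import os
import tempfile
import zipfile

SIDECAR_EXTS = (".shp", ".shx", ".dbf", ".prj", ".cpg", ".sbn", ".sbx")

def zip_shapefile(folder, base, out_zip):
    """Add every sidecar named <base><ext> that exists in `folder` to `out_zip`."""
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for ext in SIDECAR_EXTS:
            path = os.path.join(folder, base + ext)
            if os.path.exists(path):
                zf.write(path, arcname=base + ext)
    return out_zip

# Demo: fake sidecars in a temp folder, then zip them.
tmp = tempfile.mkdtemp()
for ext in (".shp", ".shx", ".dbf", ".prj"):
    open(os.path.join(tmp, "parcels" + ext), "w").close()

zip_path = zip_shapefile(tmp, "parcels", os.path.join(tmp, "parcels.zip"))
print(sorted(zipfile.ZipFile(zip_path).namelist()))
# -> ['parcels.dbf', 'parcels.prj', 'parcels.shp', 'parcels.shx']
```

The key point is that all sidecars must share the same base name inside the zip, otherwise ArcGIS Online will not recognize the upload as a shapefile.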
So, what was the deal with me and mapshaper?
🛑 A fair warning here and please read these bullet points very carefully:
I need you to remember: I absolve myself of any responsibility for what happens to your data should you misinterpret the steps I share.
Please always 👏🏻 BACK 👏🏻 UP 👏🏻 YOUR 👏🏻 DATA. Don't even attempt any tool or procedure that I am sharing without doing so. Please. Because I am an analyst too, and hearing that someone forgot to save their data or create a backup is enough to make me die a little inside.
For this tool, please export the attribute table of your shapefile first, because this tool will CHANGE YOUR SHAPEFILE'S ATTRIBUTES.
When I was publishing the vector I had cleaned and feature-engineered via ArcGIS Pro... it took so long that I was dying inside. I'm not talking about 20 minutes or an hour. It took more than 12 hours, and it never produced the 'Successfully published' notification I expected from it.
So at around 5.30 am, I randomly typed 'simplify shapefile online free'. Lo and behold, there was mapshaper.
All I did was zip up my polygon and drag it onto the homepage, which brings you to the options for choosing the actions to execute while the data is being imported into mapshaper:
detect line intersections
snap vertices
The first option detects the intersections of lines within your vector/shapefile, which can help identify topological errors.
The snap vertices option snaps together points with identical or nearly identical coordinates. It does not, however, work with TopoJSON formats.
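To see what snapping means conceptually: it collapses vertices that sit within a small tolerance of each other onto a single point. This is not mapshaper's actual implementation, just a grid-rounding sketch of the idea, with made-up coordinates:

```python
def snap_vertices(points, tolerance):
    """Snap each (x, y) to the nearest multiple of `tolerance`,
    so near-identical coordinates collapse to one vertex."""
    return [(round(x / tolerance) * tolerance,
             round(y / tolerance) * tolerance) for x, y in points]

pts = [(10.0001, 5.0002),   # these two vertices are almost identical...
       (10.0003, 4.9999),   # ...and collapse to the same snapped point
       (12.5, 7.25)]        # this one stays distinct
snapped = snap_vertices(pts, 0.01)
print(snapped[0] == snapped[1])  # -> True
```

After snapping, shared borders between adjacent polygons line up exactly, which is what makes clean topology possible downstream.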
There is something interesting about these options too: you can enter other kinds of customized options through the tool's command-line interface! But hold your horses, peeps. I did not explore that, because here we want to fix an issue, and we'll focus on that first. I checked both options and imported the data.
This brings you to a page where you can start configuring the options and the method used to simplify your vector.
To simplify your shapefile, you can enable two options to prevent the shapes of your polygons from being compromised: 'prevent shape removal', and 'use planar geometry', which uses planar Cartesian geometry instead of the usual geodesic longitude and latitude. The implication of the second option is not obvious to me yet, since all I wanted was to get the data simplified for an easy upload and clean topology; thus, I chose both options to maintain the shape and visibility of all my features despite the highest degree of simplification.
Similar to the simplification methods in mainstream software, I can see familiar names:
Douglas-Peucker
Visvalingam / effective area
Visvalingam / weighted area
First and foremost, I hadn't the slightest idea what these were. Like, for real. I usually go for the default first to understand what sort of output it will give me, and here the default, Visvalingam / weighted area, seemed like the best option. So what are these simplification methodologies? They are just algorithms used to help simplify your vectors:
🎯 The Douglas-Peucker algorithm decimates a curve composed of line segments to a similar curve with fewer points (Ramer-Douglas-Peucker algorithm, Wikipedia, 2021).
🎯 The Visvalingam algorithm is a line simplification operator that eliminates the least significant points of a line based on the concept of effective area: the area of the triangle formed by each point with its two immediate neighboring points (Visvalingam Algorithm | aplitop).
🎯 The Visvalingam algorithm with weighted area is a later development of the Visvalingam algorithm in which an alternative, weighted metric is used to take the shape into account (Visvalingam & Whelan, 2016).
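To make the difference concrete, here is a compact Python sketch of both ideas. This is not mapshaper's source code, and the weighted variant (mapshaper's default) additionally multiplies each effective area by a shape-dependent weight before comparing, which is omitted here:

```python
import math

def perp_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    if length == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * px - dx * py + bx * ay - by * ax) / length

def douglas_peucker(pts, epsilon):
    """Keep the endpoints; recurse on the farthest point if it deviates > epsilon."""
    if len(pts) < 3:
        return pts[:]
    i, dmax = max(((i, perp_dist(pts[i], pts[0], pts[-1]))
                   for i in range(1, len(pts) - 1)), key=lambda t: t[1])
    if dmax <= epsilon:
        return [pts[0], pts[-1]]
    return douglas_peucker(pts[:i + 1], epsilon)[:-1] + douglas_peucker(pts[i:], epsilon)

def tri_area(a, b, c):
    """'Effective area' of the triangle formed by a point and its two neighbors."""
    return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2

def visvalingam(pts, min_area):
    """Repeatedly drop the interior point with the smallest effective area."""
    pts = pts[:]
    while len(pts) > 2:
        areas = [tri_area(pts[i - 1], pts[i], pts[i + 1])
                 for i in range(1, len(pts) - 1)]
        k = min(range(len(areas)), key=areas.__getitem__)
        if areas[k] >= min_area:
            break
        del pts[k + 1]
    return pts

wiggly = [(0, 0), (1, 0.05), (2, 0), (3, 0.05), (4, 0)]
print(douglas_peucker(wiggly, 0.1))  # -> [(0, 0), (4, 0)]
```

Notice the difference in flavor: Douglas-Peucker asks "how far does a point stray from the trend line?", while Visvalingam asks "how much area does this point actually contribute?", which is why the latter tends to preserve overall shape better at aggressive settings.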
For reasons I can't even explain, I configured my methodology to use the third option, and now that I have had the time to google it, thank God I did.
Then, see and play with the magic at the 'Settings' slider, where you can adjust and preview the simplification applied to the vector! I adjusted it to 5%, and the shape was retained beautifully. Please bear in mind that this vector was converted from a raster, so what I really wanted was a simplified version of the cleaned data, ready for upload.
Now that you've simplified it, export it as a zipped shapefile; after extracting it, you can use it like any other shapefile.
Remember when I said you have to export your attribute table before using this tool? Yeah... that's the thing. The attribute table will shock you, because it will be empty. Literally. Only the OBJECTID is left. Now, with the attribute table you backed up, use the 'Join Table' tool in ArcGIS Pro or ArcMap to join the attributes back in without any issues.
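If you prefer to script the re-join, the concept behind 'Join Table' is just a keyed lookup. A minimal standard-library sketch (field names and values here are hypothetical):

```python
import csv
import io

# Hypothetical attribute backup exported before simplification
# (in practice, the CSV or dBase export of your attribute table).
backup_csv = """OBJECTID,region,chem_ppm
1,Bay A,0.4
2,Bay B,1.2
3,Bay C,0.7
"""

# Index the backed-up rows by their key field.
attrs = {row["OBJECTID"]: row for row in csv.DictReader(io.StringIO(backup_csv))}

# After simplification, each feature retains only its OBJECTID...
simplified_ids = ["1", "2", "3"]

# ...so the "join" is a lookup that re-attaches the full attribute row.
restored = [attrs[oid] for oid in simplified_ids]
print(restored[1]["region"])  # -> Bay B
```

This is exactly why the backup matters: OBJECTID is the only key that survives, and everything else has to be matched back onto it.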
Phew!!
I know it has a lot more functions than this, but hey, I'm just getting started. If you have ever done anything more rocket-science than what I did two days ago, please share it with the rest of us. Cause I gotta say, this thing is cray!! Love it so much.
mapshaper developer, if you're seeing this, I 🤟🏻 you!
UPDATE
I have been asked about the confidentiality of the data. I think this is where you'll understand why the tool works even with just the '.shp' file of the shapefile, since _that_ is the vector portion of the shapefile.
A shapefile is a spatial data format that is actually made up of several files: at minimum .shp, .shx, and .dbf, usually accompanied by a .prj. Each of these files shares the same name with a different extension. Although I am not familiar with what .shx actually accounts for, the rest are pretty straightforward:
.prj: stores the projection information
.dbf: stores the tabulated attributes of each feature in the vector file
.shp: stores the shape/vector information of the shapefile.
So, as the tool indicates, it helps with the vector aspect of your data, which is crucial in cartography.
March 9, 2021: cloudeo AG, the global leader in the geospatial solutions marketplace, and its strategic partner Precision Landing GmbH today announced the acquisition of TerraLoupe GmbH, a Munich-based geo-platform and HD-mapping company that uses machine learning technology to automatically detect objects from aerial imagery and create a 3D digital twin of the world.
TerraLoupe supports its customers in mapping roads, parking, buildings, pools, vegetation, and infrastructure, using state-of-the-art deep learning algorithms and large amounts of data to analyze aerial imagery at scale.
TerraLoupe was founded on the premise that understanding accurate geo-image data could fundamentally improve businesses' decision-making, introducing new prospects for numerous industrial applications. TerraLoupe acquired aerial images through its network of partners, analyzed them with its proprietary machine learning algorithms, and provided object recognition to locate every piece of relevant information for its customers.
“This acquisition strengthens our customer reach and increases alignment to our core strategic objectives with the geospatial solutions marketplace to our customers and partners into one standard operating platform globally. It adds key technical competencies such as machine learning to our platform,” said Dr. Manfred Krischke, CEO of cloudeo AG.
After the successful merger of TerraLoupe's AI department for Autonomous Systems Safety with Edge Case Research GmbH in May 2020, this is the second M&A transaction for TerraLoupe, with a glorious future ahead.
“I am thrilled that TerraLoupe now has an impact in two different domains within such important emerging markets,” said Manuela Rasthofer, founder of TerraLoupe.
TerraLoupe will continue to serve its customers as TerraLoupe GmbH in Switzerland, as a joint venture of cloudeo AG (holding a majority stake) and Precision Landing GmbH.
“The geodata and earth observation industry has become a significant part of Precision Landing’s business over the last few years. The joint venture with the strategic partner cloudeo AG to acquire TerraLoupe’s business is a considerable step based on the corporate growth strategy and expanding Precision Landing into the geodata and EO industry,” said Christian Kling, Managing Partner of Precision Landing.
About cloudeo AG
Since 2012, cloudeo's customers across industries have been able to access high-quality, ready-to-use data from various geodata sources, such as satellites, airplanes, drones, and in-situ sensors, as Data as a Service (DaaS) in a few clicks. Software as a Service (SaaS) allows cloudeo's customers to use the latest versions of the different software types needed to process the geodata without worrying about updates, releases, etc.
Cloudeo's platform also hosts a wide range of value-added service providers and developers who simplify and customize the data to user-specific needs and provide analytics that yield meaningful insights. The best part is that all cloudeo services can be used without increasing the user's IT infrastructure costs, thanks to cloudeo's Infrastructure as a Service (IaaS). This service adapts to the user's specifications, code, workflow, and scaling needs, at an affordable cost and at your preferred location and time.
https://www.cloudeo.group/
About Precision Landing GmbH
Since 2004, Precision Landing has served its customers in strategic business consulting, business process management, and digitalization with a best-practice approach across various industries, including the public sector.
For Further Information: www.precision-landing.com
Harisha Hangaravalli Marketing Manager cloudeo AG hhangaravalli@cloudeo.group
By bringing together maps, apps, data, and people, geospatial information allows everyone to make more informed decisions. By linking science with action, geospatial information enables institutes, universities, researchers, governments, industries, NGOs, and companies worldwide to innovate in planning and analysis, operations, field data collection, asset management, public engagement, simulations, and much more.
Areas of Engagement
Researchers use spatial information to trace urban growth patterns, assess access to mobility and transportation networks, analyze the impact of climate change on human settlements, and more. When spatial datasets are linked with non-spatial data, they become even more useful for developing applications that can make a difference. For instance, geospatial data coupled with land administration and tenure data can significantly impact urban planning and development by landowners.
For example, a recent project by the Government of India, 'The Swamitva Project,' uses geospatial data to provide an integrated property validation solution for rural India. 'Swamitva,' which stands for Survey of Villages and Mapping with Improvised Technology in Village Areas, uses drone surveying technology and Continuously Operating Reference Station (CORS) technology for mapping the villages. The project aims to provide a 'record of rights' to village household owners possessing houses in inhabited rural areas, enabling them to use their property as a financial asset for taking loans and other financial benefits from banks. The project stands to empower the rural people of India.
Geospatial technology is also playing a crucial role in disaster management. Deploying geospatial data across all phases of disaster management, including prevention, mitigation, preparedness, vulnerability reduction, response, and relief, can achieve significant disaster risk reduction.
We are all aware that when a hailstorm strikes, the damage can be catastrophic. In fact, with damage totals sometimes exceeding USD 1 billion, hailstorms are the costliest severe-storm hazard for the insurance industry, making reliable, long-term data necessary to estimate insured damages and assess extreme loss risks.
That's why a team of NASA scientists is currently working with international partners to use satellite data to detect hailstorms and hail damage and to predict patterns in hail frequency. The project will provide long-term regional- to global-scale maps of severe storm occurrence, catastrophe models, and new methods to improve short-term forecasting of these storms.
"We're using data from many satellite sensors to dig in and understand when and where hailstorms are likely to occur and the widespread damage that they can cause," shares Kristopher Bedka, principal investigator at NASA's Langley Research Center in Hampton, Virginia. "This is a first-of-its-kind project, and we're beginning to show how useful this satellite data can be to the reinsurance industry, forecasters, researchers, and many other stakeholders."
Climate change is another area where research based on geospatial data is extremely important. Geospatial analysis not only provides visual proof of harsh weather conditions, melting polar ice caps, dying corals, and vanishing islands, but also links all kinds of physical, biological, and socioeconomic data in a way that helps us understand what was, what is, and what could be. For instance, air quality is a public health issue that requires ongoing monitoring. Not only does air quality data provide information that can protect residents, but it also helps to monitor the overall safety of a geographic area. NASA uses satellites to collect air quality data on an ongoing basis. The satellites can evaluate air quality conditions in near real time and observe the different layers and effects that may coincide.
Satellite data can reveal information like the aerosol index and aerosol depth (which indicates the extent to which aerosols are absorbing light and affecting visibility) in any given area. Other types of data that satellites collect include levels of carbon monoxide, nitrous oxide, nitric acid, sulphur dioxide, fires, and dust.
Near real-time data helps to warn residents of low air quality. It can also be used to determine how climate change impacts a geographic area and guides new infrastructure design with climate change in mind.
Thus, there is hardly any area of life where geospatial data or location intelligence cannot have a significant impact. The more one engages in geospatial research, the more fascinating the journey of discovery becomes.
COVID-19 and Geospatial Research
The current pandemic has made each one of us realize the importance of geospatial data all the more. Be it identifying hotspots, taking corrective measures sooner, or curbing the spread of the virus, geospatial research has enabled the authorities to make headway more effectively in all spheres.
Location has been the answer to most of these problems. Thus, more and more companies are now developing apps that can help trace the virus and help businesses and individuals recover faster. Accordingly, researchers are increasingly engaging in projects that could soon bring such significant geospatial products to life.
Research bottlenecks due to COVID
Projects that aim to harness the power of location intelligence have high-end hardware, software, and data requirements. Researchers typically have high-end infrastructure at their disposal, housed either in their workplace or in the universities where they pursue their research projects. These research hubs are also their gateway to highly accurate geospatial data. In regular times, they do not need to look anywhere else. However, the scenario has completely changed due to the pandemic. With universities remaining closed for a long time, researchers are more dependent on the online availability of good-quality infrastructure and data. As a result, platforms providing Data as a Service (DaaS), Software as a Service (SaaS), and Infrastructure as a Service (IaaS) solutions are becoming popular among the research community.
Meeting hardware requirements
Lockdown and closure of educational institutions have brought researchers face-to-face with the lack of high-end infrastructure required for processing geospatial data. In such a scenario, researchers rely more and more on platforms that can provide the convenience of Infrastructure as a Service (IaaS).
Hardware is becoming less and less essential in the age of cloud computing. Cloudeo's Infrastructure as a Service (IaaS) solutions can help researchers streamline their hardware requirements, save costs, and increase the efficiency of their research work. With cloudeo's Infrastructure as a Service, a researcher pays only for what they need, which minimizes investment in local infrastructure. They can also quickly and dynamically adapt the processing power or storage they need, spreading big processing jobs over many cores, something they might not accomplish (or afford) with their own physical hardware. Also, while using this service, all data is backed up securely in the cloud and remains protected from unexpected critical hardware failures.
Meeting geospatial data requirements
Researchers need access to platforms that can provide high-quality geospatial data from multiple sources at low cost. Cloudeo's DaaS is emerging as an increasingly popular solution among researchers for accessing highly valuable data from various sources, such as suppliers of spaceborne, airborne, and UAV imagery and data. It is a cost-effective solution, as users do not have to buy permanent licenses for EO data integration, management, storage, and analytics. Data as a Service is especially advantageous for short-term projects, where long-term or permanent licenses and data purchases can become cost-prohibitive.
Over the past few years, more and more Earth Observation (EO) data, software applications, and IT services have become available from an increasing number of EO exploitation platform providers – funded by the European Commission, ESA, other public agencies, and private investment.
For instance, ESA's Network of Resources (NoR) supports users in procuring services and outsourcing requirements while increasing uptake of EO data and information for broader scientific, social, and economic purposes. The goal is to support the next generation of commercial applications and services.
Cloudeo acts as the NoR Operator, together with its consortium partners RHEA Group and BHO Legal, managing the onboarding of service providers onto the NoR Portal, maintaining the NoR Service Portfolio, promoting NoR services worldwide, and procuring such services for commercial users and ESA sponsorship.
Through NoR, cloudeo plays a vital role in improving and supporting education, research, and science. It is promoting community building by enabling collaboration between all stakeholders.
Research is crucial for the growth of an economy. As businesses increasingly realize the importance of integrating geospatial data with everyday affairs, research in the geospatial field is gaining momentum, and the need for access to high-quality geospatial data is increasing with it. While most universities are equipped to meet researchers' needs in regular times, with the current lockdowns and closures of institutions, researchers are relying on robust, accurate online platforms that can meet their hardware, software, and geospatial data requirements. Cloudeo is one such reliable platform for accessing geospatial data from disparate sources, and it can also meet the infrastructure requirements of researchers effectively. By bringing data creators, data processors, data users, and solution/app developers onto one platform, cloudeo is creating a user-friendly geospatial solutions marketplace that meets the infrastructure, software, and data needs of researchers.
Explore cloudeo today and take an essential step towards excelling in your research and academic endeavors. No one can tell you about everything spatial so accurately!
The farms of Southern Italy experience a particularly long Mediterranean growing season. The chief harvested crops of the fertile but seismically active Campania region are fruits, wine, tobacco, and flowers.
The agriculture industry is a pillar of the Campania economy. Here, the average farm is only 4 hectares in size but nevertheless highly productive. For larger farms, it can be challenging to measure how much and where more water is needed for optimal plant growth, even if they appear green in satellite imagery. Regardless of farm size, however, achieving optimal plant health is vital to a bountiful harvest.
VegetationVitality uses scientifically validated and industry-accepted methods to deliver insightful information on the health of vegetation.
Analysis performed on the image reveals much more than a picture. Using reflected light in the red and near-infrared bands, NDVI, the normalized difference vegetation index, measures chlorophyll levels in leaves. When chlorophyll levels are at their maximum, plants are growing healthy and strong.
The darker green areas have the highest NDVI, indicating the healthiest plants. Tan and red areas either contain weaker plants, bare soil, or human settlements.
In this case, clouds over the water and snow on the mountaintops also appear red.
Powered by the Harris Geospatial Services Framework (GSF) and HySpeed Computing, VegetationVitality also provides detailed analytical reports on the area of interest (AOI). The included report shows that in the Campania region, over 7,000 sq. km are vegetated, but only 1,000 sq. km contain plants with high vitality at this time. On a farm, the light green areas could probably use more water, whereas the tan areas could be ready for planting.
The scale is adjustable to any AOI. Including the aforementioned NDVI, there are six different vegetation indices available to choose from, each offering its own advantages and insights into the health and condition of plants based on vegetation type and environmental conditions.
The Non-Linear Index (NLI), for example, is another Vegetation Vitality index, which provides amplified discrimination between healthy crops and other land-use types. Fields are clearly delineated in green, while infrastructure and bare fields appear in red. In the following example, a zoomed-in snapshot of the coastline with NLI shows that the area is not only sparsely vegetated but also well populated.
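Both indices are simple band arithmetic on red and near-infrared (NIR) reflectance. A minimal Python sketch, using hypothetical reflectance values and the formulas as commonly published:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def nli(nir, red):
    """Non-Linear Index, commonly given as (NIR^2 - Red) / (NIR^2 + Red)."""
    return (nir ** 2 - red) / (nir ** 2 + red)

# Hypothetical surface-reflectance values in the 0-1 range.
canopy = ndvi(0.50, 0.08)   # dense, healthy vegetation -> high NDVI
soil   = ndvi(0.25, 0.20)   # bare soil -> NDVI near zero
print(round(canopy, 2), round(soil, 2))  # -> 0.72 0.11
```

Healthy vegetation reflects strongly in the NIR band and absorbs red light for photosynthesis, which is why high values of either index indicate vigorous plant growth.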
Your Vegetation Vitality data package is delivered with images and shapefiles ready for immediate use or further analysis, as well as an easy-to-read PDF report that breaks down the results into helpful summaries, complete with graphs and legends. Orders start from 10€.
Click here to buy the product
Vegetation Vitality Benefits
Low cost and fast delivery
Current and past images
Subscribe on a time basis
Tailored results to your needs
No processing image yourself
No expensive software licenses
https://www.cloudeo.group/