<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[vis.gl - Medium]]></title>
        <description><![CDATA[Open-source, WebGL-powered visualization frameworks - Medium]]></description>
        <link>https://medium.com/vis-gl?source=rss----7d5390e55872---4</link>
        <image>
            <url>https://cdn-images-1.medium.com/proxy/1*TGH72Nnw24QL3iV9IOm4VA.png</url>
            <title>vis.gl - Medium</title>
            <link>https://medium.com/vis-gl?source=rss----7d5390e55872---4</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Sat, 11 Apr 2026 05:26:00 GMT</lastBuildDate>
        <atom:link href="https://medium.com/feed/vis-gl" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[deck.gl v8.6 now available with deeper Google Maps support]]></title>
            <link>https://medium.com/vis-gl/deck-gl-v8-6-now-available-with-deeper-google-maps-support-b734719076a7?source=rss----7d5390e55872---4</link>
            <guid isPermaLink="false">https://medium.com/p/b734719076a7</guid>
            <category><![CDATA[google-maps]]></category>
            <category><![CDATA[location-intelligence]]></category>
            <category><![CDATA[geospatial]]></category>
            <category><![CDATA[webgl]]></category>
            <category><![CDATA[data-visualization]]></category>
            <dc:creator><![CDATA[Alberto Asuero]]></dc:creator>
            <pubDate>Tue, 12 Oct 2021 15:47:40 GMT</pubDate>
            <atom:updated>2021-10-12T17:08:04.428Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*z7vYDoQWEEJwF9g21xbvZg.png" /></figure><p>Today we are very pleased to announce the availability of <a href="https://deck.gl/">deck.gl</a> v8.6, which, alongside a whole host of <a href="https://deck.gl/docs/whats-new">new features</a>, adds full support for the WebGL Overlay View feature of the Google Maps JavaScript API.</p><p><a href="https://mapsplatform.google.com/">Google Maps Platform</a> revolutionized the web mapping space when it was launched back in 2005 and has become the most popular framework for developing map applications on the web.</p><p>As the leading open source framework for data visualization, we recognize the importance of strong Google Maps support for deck.gl, and in 2019 <a href="https://medium.com/vis-gl/using-deck-gl-with-google-maps-9c868d18e3cd">we kickstarted development</a> to ensure the two could work seamlessly together. This made it possible to use Google Maps as a raster basemap, but with this year’s addition of <a href="https://cloud.google.com/blog/products/maps-platform/using-new-webgl-powered-maps-features">support for WebGL overlays</a> we have taken it even further.</p><p>Luckily, Google shares this view and has been working hard alongside the <a href="https://carto.com/">CARTO</a> team to add much stronger deck.gl support for Google Maps by utilizing these new WebGL features.</p><p>Read the latest <a href="https://cloud.google.com/blog/products/maps-platform/richer-data-visualization-google-maps-platform-using-deckgl">blog post from Google Maps Platform</a> to check out the great visualizations they have created to showcase the deck.gl and Google Maps integration.</p><iframe 
src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FXChGDHUtQCA%3Ffeature%3Doembed&amp;display_name=YouTube&amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DXChGDHUtQCA&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FXChGDHUtQCA%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/5907d840d4ff0314661c0dd8acc324c9/href">https://medium.com/media/5907d840d4ff0314661c0dd8acc324c9/href</a></iframe><p><strong>What’s new?</strong></p><p>This new release adds support for the vector basemap in interleaved mode. This means that with deck.gl, Google’s vector map can now be mixed with data layers, providing a pixel perfect composition where labels, 3D, and other content are respected and rendered perfectly with depth and occlusion.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1016/0*Rz7B9EYb0iYmjcq9" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/889/0*wt7gbDoUmjRqz4cV" /></figure><h3>Full vector rendering support with backwards compatibility</h3><p>While implementing support for vector rendering, we worked hard to maintain compatibility with the existing <a href="https://github.com/visgl/deck.gl/blob/master/docs/api-reference/google-maps/google-maps-overlay.md">GoogleMapsOverlay</a> class. So you can continue using it as before but with the following enhancements:</p><ul><li>Shared 3D space: objects drawn by the GoogleMapsOverlay class appear inside the Google Maps scene, correctly intersecting with 3D buildings and behind the contextual labels drawn by Google Maps.</li><li>Tilting and rotating the view is supported.</li><li>Rendering uses the same WebGL context as Google Maps, improving performance.</li></ul><h3>Getting started with deck.gl and Google Maps Platform</h3><p>The basics are simple. 
Create a deck.gl layer, add it to a GoogleMapsOverlay object, and finally add that overlay to your Google Maps Map object.</p><p>You start by loading deck.gl 8.6 and the Google Maps library. The WebGL Overlay View of Google Maps is currently only available on the beta channel. You need to specify your API key and a Map ID with vector maps enabled. For more details, check out <a href="https://developers.google.com/maps/documentation/javascript/webgl/webgl-overlay-view">the official documentation</a>.</p><pre>&lt;script src=&quot;<a href="https://unpkg.com/deck.gl@8.6.0/dist.min.js">https://unpkg.com/deck.gl@8.6.0/dist.min.js</a>&quot;&gt;&lt;/script&gt;&lt;script src=&quot;https://maps.googleapis.com/maps/api/js?key=xx&amp;v=beta&amp;map_ids=xxx&quot;&gt;&lt;/script&gt;</pre><p>Next, create the Google Map and a deck.gl layer.</p><pre>// Create a Google Map<br>const map = new google.maps.Map(document.getElementById(&quot;map&quot;), {<br>  center: {lat: 50, lng: 14},<br>  tilt: 30,<br>  gestureHandling: &#39;greedy&#39;,<br>  mapId: &#39;fae05836df2dc8bb&#39;,<br>  zoom: 3<br>});</pre><pre>// Create a deck.gl visualization layer<br>const flightsLayer = new deck.ArcLayer({<br>  id: &#39;flights&#39;,<br>  data: AIR_PORTS,<br>  dataTransform: d =&gt; d.features.filter(f =&gt; f.properties.scalerank &lt; 4),<br>  getSourcePosition: f =&gt; [14.42076, 50.08804], // Prague<br>  getTargetPosition: f =&gt; f.geometry.coordinates,<br>  getSourceColor: [0, 128, 200],<br>  getTargetColor: [0, 0, 80],<br>  getWidth: 1<br>});</pre><p>Finally, create a GoogleMapsOverlay and add it to the map.</p><pre>const overlay = new deck.GoogleMapsOverlay({<br>  layers: [flightsLayer]<br>});</pre><pre>overlay.setMap(map);</pre><p><a href="https://jsfiddle.net/carto/3xhrt6nf/">Live demo on JSFiddle</a></p><p>To learn more and see a full set of example visualizations, check out this link: <a 
href="http://jsfiddle.net/user/felixp/fiddles/">http://jsfiddle.net/user/felixp/fiddles/</a></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*jhdzKzKkd1wd8qKr" /></figure><h3>Other improvements in deck.gl 8.6</h3><p>There are <a href="https://deck.gl/docs/whats-new#deckgl-v86">many</a> other improvements available in this version of deck.gl. Here are some of the highlights:</p><ul><li>CartoLayer adds new geoColumn and columns props, enabling more granular data fetching from the CARTO backend.</li><li>ColumnLayer and GridCellLayer add a radiusUnits prop.</li><li>H3HexagonLayer now supports manually forcing low-precision, high-performance rendering with highPrecision: false.</li><li>HeatmapLayer adds weightsTextureSize and debounceTimeout props for fine-tuning performance.</li><li>MVTLayer now defaults to handling geometries in binary.</li><li>ScatterplotLayer and GeoJsonLayer add an option to turn off antialiasing to avoid artifacts in depth occlusion.</li><li>TileLayer no longer purges its cache when data changes, resulting in a smoother “reload” experience.</li></ul><p>Thank you very much to all of the developers who have contributed to this latest release.</p><h3>Welcome Google Maps developers!</h3><p>The vision of deck.gl is to provide the best open source, open governance visualization library for large-scale datasets. 
CARTO and Google Maps have agreed to maintain and support this connection between deck.gl and Google Maps for the years to come.</p><p>Many organizations are now collaborating to make this project a success and that’s why we believe our <a href="https://medium.com/vis-gl/deck-gl-8-2-moves-to-open-governance-379f147c15bb">open governance</a> model is so important.</p><p>So stay tuned for more layers and features, and please share with us what you build with deck.gl 8.6.</p><p>We can’t wait to see what you build!</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=b734719076a7" width="1" height="1" alt=""><hr><p><a href="https://medium.com/vis-gl/deck-gl-v8-6-now-available-with-deeper-google-maps-support-b734719076a7">deck.gl v8.6 now available with deeper Google Maps support</a> was originally published in <a href="https://medium.com/vis-gl">vis.gl</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[deck.gl 8.2 moves to Open Governance]]></title>
            <link>https://medium.com/vis-gl/deck-gl-8-2-moves-to-open-governance-379f147c15bb?source=rss----7d5390e55872---4</link>
            <guid isPermaLink="false">https://medium.com/p/379f147c15bb</guid>
            <category><![CDATA[webgl]]></category>
            <category><![CDATA[visualization]]></category>
            <category><![CDATA[open-source]]></category>
            <category><![CDATA[javascript]]></category>
            <dc:creator><![CDATA[Ib Green]]></dc:creator>
            <pubDate>Fri, 07 Aug 2020 15:01:01 GMT</pubDate>
            <atom:updated>2020-08-07T17:36:45.734Z</atom:updated>
            <content:encoded><![CDATA[<p><a href="https://deck.gl">deck.gl</a>, which has been developed under the stewardship of Uber’s Engineering organization over the last 5 years, has now taken a big step and moved to an open governance model. In addition, deck.gl 8.2, the first community planned version of deck.gl, has now been released.</p><p>This blog post shares some thoughts about why the move to open governance was so important, and gives some glimpses of the exciting public roadmap that was presented.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1000/0*kbcsaNTmOSy46jdZ" /><figcaption>deck.gl 8.2, the first community planned deck.gl release, is now available. The image shows the new globe visualization mode in deck.gl 8.2.</figcaption></figure><h3>Why Open Governance?</h3><p>Many great open source software projects, while free to use and open to contribution, are still controlled by one company or a small group of people. This is often fine, as the owners are also typically the maintainers and tend to work hard (usually for free) to support their communities.</p><p>But once an open software project gets to the point where multiple companies depend on it for their own commercial products, and the project starts to receive major development contributions from those companies, the introduction of an open governance model can make a big difference in removing barriers to adoption.</p><p>The goal for moving vis.gl to open governance was to create an even more inviting playing field for anyone who wants to use or contribute to deck.gl. 
Contributors can now become involved in the decision-making process, and those who make significant contributions can even get seats on the steering committee and get a say in all vis.gl-related program matters.</p><h3>An Open Planning Process</h3><p>A key aspect of open governance is to let the community take part in the planning of upcoming software releases, and the first community planning meeting for the deck.gl 8.2 release was held in early May 2020.</p><p>The various vis.gl tech leads took turns sharing the plans for the <a href="http://docs.google.com/presentation/d/1MZbZrzcCB3THwM0KeSDs7VK5eczCp4IkO7U84mHhebY/edit#">upcoming deck.gl 8.2 and 8.3 releases</a>, as well as longer-term roadmaps.</p><p>The planning meeting was well attended, with representatives from a number of leading geospatial software companies, and the feedback from the audience was very positive.</p><h3>Highlights from the community planning meeting</h3><p>Advanced TileLayer development is a major focus area for deck.gl 8.x.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/667/0*jwojDj_9_XStJdBI" /><figcaption>Gigapixel zoom</figcaption></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/640/0*tsGQ82tEj8qNc7SJ" /><figcaption>TerrainLayer</figcaption></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/600/0*7mXkI9p0l9BoMqZi" /><figcaption>I3S tile set rendering a textured 3D mesh model of San Francisco</figcaption></figure><p>Support for tiled data has seen a lot of development over the last year, and the deck.gl + loaders.gl combination now provides comprehensive support for tiled data layers, including 2D tiles (geospatial and non-geospatial), 3D tiles (both the 3D Tiles and I3S OGC standards are supported), as well as terrain from tiled elevation data sources.</p><p>Many “tricky corner cases” that arise when visualizing tiled data, such as cross-tile highlighting, working with high-pitch views, and request throttling when loading multiple tiled data sources, are now handled correctly by deck.gl.</p><h3>Longer Term Roadmaps (deck.gl v9)</h3><p>The open governance meeting also gave the tech leads a good opportunity to provide the community with a glimpse of features that are being considered for the next major releases of the frameworks, such as:</p><p>Improved support for non-JavaScript programming languages</p><ul><li><a href="https://pydeck.gl">pydeck</a></li><li><a href="https://github.com/UnfoldedInc/deck.gl-native">deck.gl-native</a> (C++ port)</li><li>Swiftdeck, Javadeck for mobile</li><li>Language-independent styling specification, cross-language transport protocol</li></ul><p>Core Features</p><ul><li>Globe projections</li><li>Advanced Tile Layer Features</li><li>TypeScript, WebGPU, ES Modules</li></ul><p>These are just a few highlights; for those interested in the details, the slides from the presentation are linked at the end of this post.</p><p>Also check out the <a href="https://deck.gl/docs/whats-new">deck.gl 8.2 What’s New</a> page, as it details the outcome of the first community-planned release.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*aIyDZv2iGEPozUXi" /><figcaption>A new member of the deck.gl family, <a href="https://deck.gl">pydeck</a> brings deck.gl to the Python data science community</figcaption></figure><h3>Join the vis.gl movement!</h3><p>A goal of moving deck.gl and the other vis.gl frameworks to an open governance model is, of course, to open the doors to additional participation. We want to grow the vis.gl community, and if you are interested, don’t hesitate to reach out on the channels below. 
We look forward to hearing from you!</p><ul><li><a href="https://medium.com/vis-gl">vis.gl blog</a> (Medium)</li><li><a href="https://join.slack.com/t/deckgl/shared_invite/zt-7oeoqie8-NQqzSp5SLTFMDeNSPxi7eg">deck.gl slack room</a></li><li><a href="https://deck.gl/issues">deck.gl GitHub issues</a></li><li><a href="http://docs.google.com/presentation/d/1MZbZrzcCB3THwM0KeSDs7VK5eczCp4IkO7U84mHhebY/edit#">vis.gl community planning meeting slides</a></li><li>join <a href="https://lists.uc.foundation/g/visgl">vis.gl announcement mailing list</a> (low volume)</li><li><a href="https://github.com/visgl/tsc">vis.gl technical charter</a></li><li><a href="https://deck.gl">deck.gl website</a></li></ul><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=379f147c15bb" width="1" height="1" alt=""><hr><p><a href="https://medium.com/vis-gl/deck-gl-8-2-moves-to-open-governance-379f147c15bb">deck.gl 8.2 moves to Open Governance</a> was originally published in <a href="https://medium.com/vis-gl">vis.gl</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[pydeck: Unlocking deck.gl for use in Python]]></title>
            <link>https://medium.com/vis-gl/pydeck-unlocking-deck-gl-for-use-in-python-ce891532f986?source=rss----7d5390e55872---4</link>
            <guid isPermaLink="false">https://medium.com/p/ce891532f986</guid>
            <category><![CDATA[python]]></category>
            <category><![CDATA[data-science]]></category>
            <category><![CDATA[data-visualization]]></category>
            <category><![CDATA[gis]]></category>
            <category><![CDATA[maps]]></category>
            <dc:creator><![CDATA[Andrew Duberstein]]></dc:creator>
            <pubDate>Tue, 15 Oct 2019 21:29:09 GMT</pubDate>
            <atom:updated>2019-10-15T21:29:09.120Z</atom:updated>
<content:encoded><![CDATA[<p>Uber’s open source <a href="https://deck.gl/">deck.gl</a> library powers internal data visualization tools and brings to life award-winning visualizations from across the community.</p><p>Today, we’re excited to announce that we’ll be bringing deck.gl’s declarative syntax, ease of use, and ability to render massive datasets to Python with the beta release of <a href="https://pypi.org/project/pydeck/">pydeck</a>, a set of Python bindings for deck.gl. In this post, we’ll cover pydeck’s unique features, how it works under the hood, and how you can get started using it today.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*86RzvaLsplcxRoJG" /></figure><p>The mission of pydeck is to let Python users create deck.gl maps without having to know a lot of Javascript. At its core, pydeck focuses on data analytics use cases, so it works best in environments used by data professionals, like Jupyter Notebooks. With pydeck, users can embed visualizations interactively in a Jupyter Notebook or simply export them to a standalone HTML file.</p><p>A few mapmaking libraries exist within the Python ecosystem, so it’s worth highlighting the unique features that differentiate pydeck:</p><ul><li>The availability of the full deck.gl layer catalog in Python</li><li>Support for large-scale updates, like color changes or data modification, to hundreds of thousands of visualized data points</li><li>Two-way communication, where data selected in a visualization can be passed back to a Jupyter Notebook’s kernel</li><li>The ability to map hundreds of thousands of data points in 2D and 3D via a Python API</li></ul><figure><img alt="" src="https://cdn-images-1.medium.com/max/620/1*wJAUM3rY_mQCD-_fYWqS7g.gif" /><figcaption>2010 San Francisco building polygons, colored by elevation</figcaption></figure><p>Under the hood, pydeck converts its Python objects to JSON and passes that JSON to the new @deck.gl/json API. 
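As a minimal sketch of that conversion, the snippet below builds a payload of the general shape pydeck serializes; the "@@type" key is the deck.gl/json convention for naming a layer class, while the specific layer, coordinates, and accessor string here are illustrative rather than taken from this post:

```python
import json

# Hedged sketch of a deck.gl/json-style payload: a view state plus a list of
# layer descriptors, where "@@type" names the deck.gl layer class to build.
payload = {
    "initialViewState": {"latitude": 37.78, "longitude": -122.42, "zoom": 11},
    "layers": [
        {
            "@@type": "ScatterplotLayer",
            "data": [{"position": [-122.42, 37.78]}],
            "getPosition": "@@=position",  # accessor string resolved on the JS side
            "getRadius": 100,
        }
    ],
}

# The JSON string is what crosses the Python/browser boundary.
serialized = json.dumps(payload)
print(json.loads(serialized)["layers"][0]["@@type"])  # ScatterplotLayer
```

A payload of roughly this shape is what gets handed to @deck.gl/json.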
This library interprets JSON objects into deck.gl layers, letting users render a visualization written in Javascript without having to know Javascript itself. Pydeck augments this by also providing components for Jupyter Notebook integration and convenience functions for data processing.</p><p>You can get started using pydeck with a simple copy and paste in your terminal:</p><pre>pip install pydeck</pre><p>To see pydeck in action, you can copy and paste this Python code into a Jupyter Notebook or an IPython terminal:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/6eab9a059048d7d26031465fd3642ac5/href">https://medium.com/media/6eab9a059048d7d26031465fd3642ac5/href</a></iframe><p>You can also check out <a href="https://mybinder.org/v2/gh/uber/deck.gl/binder?filepath=examples">the hosted examples on mybinder.org</a>, where you can use pydeck in your browser without installing it. For further details while coding, review the documentation home page <a href="https://deckgl.readthedocs.io/en/latest/?">here</a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/998/1*W1JaZGijTa7nTeF_YK--ZQ.gif" /><figcaption><a href="https://en.wikipedia.org/wiki/Conway%27s_Game_of_Life">Conway’s Game of Life</a> in pydeck</figcaption></figure><p>You can file <a href="https://github.com/uber/deck.gl/issues/new?assignees=&amp;labels=bug&amp;template=bug-report.md&amp;title=">bugs</a> and submit <a href="https://github.com/uber/deck.gl/issues/new?assignees=&amp;labels=feature&amp;template=feature-template.md&amp;title=">feature requests</a> with our team on GitHub. 
Contributions are both welcome and encouraged, not only to improve pydeck but also to build bindings for other languages via the @deck.gl/json library.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=ce891532f986" width="1" height="1" alt=""><hr><p><a href="https://medium.com/vis-gl/pydeck-unlocking-deck-gl-for-use-in-python-ce891532f986">pydeck: Unlocking deck.gl for use in Python</a> was originally published in <a href="https://medium.com/vis-gl">vis.gl</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Exploring Geospatial data with kepler.gl]]></title>
            <link>https://medium.com/vis-gl/exploring-geospatial-data-with-kepler-gl-cf655839628f?source=rss----7d5390e55872---4</link>
            <guid isPermaLink="false">https://medium.com/p/cf655839628f</guid>
            <category><![CDATA[geospatial]]></category>
            <category><![CDATA[uber]]></category>
            <category><![CDATA[data-science]]></category>
            <category><![CDATA[keplergl]]></category>
            <category><![CDATA[data-visualization]]></category>
            <dc:creator><![CDATA[Shan He]]></dc:creator>
            <pubDate>Mon, 26 Aug 2019 16:37:44 GMT</pubDate>
            <atom:updated>2019-08-26T22:36:12.136Z</atom:updated>
<content:encoded><![CDATA[<h3>Exploring Geospatial Data with kepler.gl</h3><p>Co-authors: <a href="https://medium.com/@gabriel_52469">Gabriel Durkin</a>, <a href="https://medium.com/@sina.rk">Sina Kashuk</a></p><p>kepler.gl is an advanced geospatial visualization tool open sourced by Uber’s visualization team in 2018 and contributed to the Urban Computing Foundation in early 2019.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*f2CkYUqaUi2HsB77zGnI5Q.png" /><figcaption>Figure 1. Using kepler.gl to visualize San Francisco building footprints</figcaption></figure><p>At Uber, kepler.gl is the de facto tool for geospatial data analysis. In a <a href="https://medium.com/vis-gl/introducing-kepler-gl-for-jupyter-f72d41659fbf">previous article</a>, we introduced kepler.gl for Jupyter Notebook. In this article, we want to showcase how data scientists at Uber use kepler.gl to understand massive amounts of aggregated geospatial data and derive insights that improve our business. All the analysis presented in this blog post is based on data aggregated by <a href="https://eng.uber.com/h3/">H3, Uber’s open source geospatial indexing system</a>, with an aperture equal to 12, for locations with a minimum of 100 trips, using at least 6 months of data.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*X9aA5ZRLoidiOEoBi7ywTA.png" /><figcaption>Figure 2. Maps… without maps. — Toronto request data with kepler.gl derived from aggregated rider GPS signals</figcaption></figure><p>Uber’s platform leverages digital solutions to tackle transportation problems in the physical world, such as ridesharing and meal delivery. Gabriel Durkin and Sina Kashuk, data scientists from Uber’s Rider Geospatial Intelligence team, leverage kepler.gl to analyze trip data, specifically to understand the real-world challenge of driver-partners and riders locating each other for pick-up in a complex cityscape. 
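The aggregation rule described above, keeping only cells that accumulate a minimum trip count, reduces to a simple group-and-filter; here is a hedged sketch using made-up cell IDs and a lowered threshold (the post uses H3 cells and a 100-trip minimum):

```python
from collections import Counter

# Hypothetical (hex_cell_id, trip_id) records; real cell IDs would come
# from H3 at the chosen resolution.
trips = [
    ("cell_a", 1), ("cell_a", 2), ("cell_a", 3),
    ("cell_b", 4),
]

MIN_TRIPS = 3  # the post uses 100; lowered here for the toy data

# Count trips per cell, then keep only cells that meet the threshold.
counts = Counter(cell for cell, _ in trips)
kept = sorted(cell for cell, n in counts.items() if n >= MIN_TRIPS)
print(kept)  # ['cell_a']
```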
Figure 2, above, illustrates how the projection of the pick-up data at high resolution can create a map of the city of Toronto based entirely on usage of the Uber app, without leveraging a single base map.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*4ndPbdSGm0r_WNYhfjFMQg.gif" /><figcaption>Figure 3. Visualizing highest concentrations of requests in New York City over a 24 hour period</figcaption></figure><p>The pick-up process of an Uber ride or Uber Eats meal is one of Sina and Gabriel’s biggest data science pain points. There are many geospatial challenges associated with the pick-up process, and our teams frequently use kepler.gl to inform the development of geospatial solutions to improve this part of the ridesharing and delivery experiences. A common visualization rendered for this type of problem solving is a time lapse animation, as depicted in Figure 3, above, to identify temporal trends and areas with a higher concentration of trip requests, which may be correlated with suboptimal pick-up experiences.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ujHope79RZ1WPS8JrwH5EQ.png" /><figcaption>Figure 4. Success Rate: blue indicates successful (and red defective) pick-ups in San Francisco</figcaption></figure><p>Sina and Gabriel also created a ‘success’ metric that identifies places associated with quality pick-up experiences, i.e., spots with a minimum of cancellations, pick-up location errors or other types of defects. These were projected onto city maps as hexagonal spatial units visualized in kepler.gl using the H3 layer (Figure 4), and ingested by our pick-up spot recommendation engine, so that the future suggested pick-up locations are far from ‘low success’ areas.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*BMMhPeiN_Mq-Bl_JAD_B6w.png" /><figcaption>Figure 5. 
Creation of automated pick-up and drop-off zones based on historical trip experience — Visualizing areas that have a high fraction of canceled trips — Numbered circles annotate cluster centers so the upper and lower maps can be compared</figcaption></figure><p>This success metric can equally be mapped to the rider device request location (likely inside a building) rather than the “in the street” vehicle pick-up location. Two complementary maps are created, as depicted in Figure 5, for understanding the pick-up experience: the top map showing hexagonal heatmaps of Uber metrics, and the lower map showing the physical landscape of buildings and city-block polygons for the same locations. The finalized maps, like those in Figure 5, are saved to an interactive HTML file that is shared with regional Uber Operations teams to flag and address complex or problematic pick-up areas. Numerical labels are applied to annotate the densest areas, and geofences are created around them (as depicted by the polygons in the lower plot). When requesting a trip in these ‘enhanced’ pick-up zones, riders will be given additional instructions to guide them through the process.</p><p>Another kepler.gl metric Gabriel and Sina assessed when working on ways to improve the Uber pick-up experience was the estimated time of arrival (ETA) error. When users request a ride on the Uber platform, they expect the in-app ETA for their driver to be as accurate as possible. ETA errors will result in poor user experiences. The two data scientists created maps to identify areas with higher ETA errors, allowing teams to better understand where the algorithms that produce the estimates are performing well and where they need to improve.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*04E7z_uxsSm--7BNH9FRbg.gif" /><figcaption>Figure 6. 
Comparing one week of ETA (left, red is high ETA) and request volume (right, yellow is high density) in San Francisco</figcaption></figure><p>For example, in the very early morning, ETAs seem longer in the northeast, evidencing sparse vehicle supply (Figure 6). The lack of clustering and low density of requests throughout suburban neighborhoods clearly presents a challenge for dispatching new trips at that time of day.</p><p>The dual map in Figure 6, above, uses kepler.gl to discover correlations between arrival times and the volume of Uber requests in San Francisco. Both maps show places of highest request volume, but the left map is colored by ETA (long ETAs in red, short ETAs in blue), and the right map is colored by request volume (high volume in yellow).</p><p>Gabriel and Sina are working with product managers on the Rider Team to build solutions that have spatial and temporal context-awareness. One possibility would be a notification sent to the rider at rush hour, e.g., “Hey, you are in a busy area. You should request a few minutes in advance to avoid longer delays.” Insights derived via the lens of kepler.gl will drive the design of potential new Uber rider app geo-contextual features.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*7zJvZNz7N5TrWa-W4HOM0w.png" /><figcaption>Figure 7. LEFT — Heterogeneity: Number of distinct riders divided by the number of trips aggregated in each hexagon &amp; MIDDLE — ETA errors: green is low, purple is high &amp; RIGHT — Avg trip distance: Average distance traveled of trips requested: darker green is longer</figcaption></figure><p>kepler.gl also provides a lens to bring clarity to the understanding of the movements and travel behaviors of people in cities. Figure 7 shows three different map layers of kepler.gl for Manhattan. The left image projects place-heterogeneity — places with a higher ratio of riders to trips are indicated in yellow (Fig.7 LEFT). 
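That ratio, the number of distinct riders divided by the number of trips in each hexagon, can be sketched directly; the cell and rider IDs below are made up for illustration:

```python
from collections import defaultdict

# Hypothetical (hex_cell_id, rider_id) trip records.
records = [
    ("airport", "r1"), ("airport", "r2"), ("airport", "r3"),        # 3 riders, 3 trips
    ("home", "r4"), ("home", "r4"), ("home", "r4"), ("home", "r5"),  # 2 riders, 4 trips
]

riders = defaultdict(set)   # distinct riders seen per cell
trip_counts = defaultdict(int)  # total trips per cell
for cell, rider in records:
    riders[cell].add(rider)
    trip_counts[cell] += 1

heterogeneity = {cell: len(riders[cell]) / trip_counts[cell] for cell in trip_counts}
print(heterogeneity)  # {'airport': 1.0, 'home': 0.5}
```

The airport-like cell scores 1.0 (every trip is a different rider) while the residential cell scores 0.5, matching the intuition described in the post.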
The heterogeneity, or public/private metric, usually indicates whether an area is more public and populated by visitors and tourists. For instance, this metric is close to 1.0 at airports because each rider typically takes a single trip from the airport, compared with &lt; 0.6 in residential neighborhoods, where the same rider may request multiple trips, e.g., every weekday morning to work. In Figure 7, these regions appear dark red. Challenging pick-ups can result from a rider’s unfamiliarity with a new place, rather than any intrinsic property of the place itself. The heterogeneity metric projected via kepler.gl highlights these cases geospatially.</p><p>The middle image of Manhattan represents ETA errors (Fig.7 MIDDLE) and suggests that more isolated locations (near the water’s edge) lead to greater uncertainty in the estimated time of arrival of the dispatched cars.</p><p>By projecting ‘completed trip distance’ onto the geo-location of requests in kepler.gl (Fig.7 RIGHT), additional geographical constraints imposed by the waterways surrounding Manhattan are apparent. For example, trips beginning at its southernmost tip tend to be longer, in part because riders must travel north to get anywhere.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*xqaNw3bbVtYk0TbIVG0rQg.png" /><figcaption>Figure 8. 
Using kepler.gl in Jupyter Notebook to visualize geospatial data</figcaption></figure><p>The Jupyter Notebook platform combined with kepler.gl (Figure 8) allows data scientists like Sina and Gabriel to identify and develop an understanding of the geospatial nature of Uber’s data — that data patterns and clustering are apparent via kepler.gl maps is proof that the ‘prior art’ of collecting data in tables and spreadsheets alone ignores the confounding influence of its spatial context.</p><p>Using kepler.gl, Gabriel and Sina were able to gain new insights, consistently increase pick-up quality, and improve rider experience across all the cities and continents where Uber operates.</p><p>Now that <a href="https://medium.com/vis-gl/introducing-kepler-gl-for-jupyter-f72d41659fbf#targetText=Introducing%20kepler.gl%20for%20Jupyter&amp;targetText=kepler.gl%20is%20an%20advanced,tool%20for%20geospatial%20data%20analysis.">kepler.gl for Jupyter</a> is open sourced, we are eager to see how it is adopted by other data scientists working in the geospatial domain and learn about inventive new use cases developed by the community in the coming months.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=cf655839628f" width="1" height="1" alt=""><hr><p><a href="https://medium.com/vis-gl/exploring-geospatial-data-with-kepler-gl-cf655839628f">Exploring Geospatial data with kepler.gl</a> was originally published in <a href="https://medium.com/vis-gl">vis.gl</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Vis Hackathon 2019]]></title>
            <link>https://medium.com/vis-gl/vis-hackathon-2019-56096209dce2?source=rss----7d5390e55872---4</link>
            <guid isPermaLink="false">https://medium.com/p/56096209dce2</guid>
            <category><![CDATA[webgl]]></category>
            <category><![CDATA[visualization]]></category>
            <category><![CDATA[maps]]></category>
            <category><![CDATA[front-end-development]]></category>
            <category><![CDATA[data-visualization]]></category>
            <dc:creator><![CDATA[Jérôme Cukier]]></dc:creator>
            <pubDate>Tue, 23 Jul 2019 16:37:45 GMT</pubDate>
            <atom:updated>2019-07-23T16:37:45.669Z</atom:updated>
            <content:encoded><![CDATA[<p>The visualization engineering team at Uber is made up of dozens of technologists with very different specialties — mapping, web, data, low-level graphics, ML, you name it. Many of us fell in love with visualization by experimenting on our own and through the joy of making stuff. And with the team so large and so diverse we don’t always have a chance to work with each other!</p><p>Two excellent reasons to launch our second Vis Hackathon, in which 11 teams from across Uber Vis participated. We had two themes this time:</p><ul><li>Using the new <a href="https://www.uber.com/newsroom/movement-street-speeds/">speed datasets</a> from <a href="https://movement.uber.com/">movement.uber.com</a>,</li><li>Showcasing new features from our frameworks like <a href="https://deck.gl/#/">deck.gl,</a> <a href="https://loaders.gl/">loaders.gl</a>, <a href="https://luma.gl/#/">luma.gl</a>, etc.</li></ul><p>Without further ado, here’s what we did!</p><h3>Urban Symphony</h3><p>Jon Sadka</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*pmHZqA1AA6Gu6Yh_Yl8nOg.gif" /></figure><p>Close your eyes for a moment and imagine what it’s like sitting in a quickly accelerating car. Chances are that part of this imagination includes sound.</p><p>Sound is a natural byproduct of movement and speed, yet, by their nature, most visualizations about speed only express data visually. 
Urban Symphony is an experiment to see what happens when we used speed data to generate a melody for a given path of road segments.</p><h3>Sparrow</h3><p>Wesam Manassra, Ed Barwani, Adam Kidder, Ib Green<br><em>Best use of city data prize</em></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/850/1*pdrWVloYtwqGBTXug8oMCA.gif" /></figure><p><a href="https://arrow.apache.org/">Apache Arrow</a> is a columnar in-memory data format that is interoperable across different languages, and makes it efficient to transport and scan large datasets without the need for serialization/deserialization on servers/clients.</p><p>We unlocked the powers of Arrow by building Sparrow, a tool to visualize a year’s worth of hourly speed data in the browser dynamically and efficiently (around 130 million speed readings). We built a lightweight Python server to read the dataset partitioned by day and stream it over HTTP. We can then visualize hourly speed data in the browser as soon as it arrives from the server. As the user scrubs to a certain date, data is loaded and flushed dynamically to maintain an upper bound on the total memory used by the browser.</p><p>This approach allows us to visualize very large data sets with low latency. 
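</p><p>The load-and-flush strategy can be sketched in a few lines of Python (our own illustration of the idea, not Sparrow's actual code, which is built around Arrow record batches; the class and function names are made up):</p><pre>
```python
from collections import OrderedDict

class PartitionCache:
    """Keep at most `max_partitions` daily partitions in memory (LRU eviction)."""

    def __init__(self, fetch, max_partitions=7):
        self.fetch = fetch               # e.g. an HTTP call returning one day's data
        self.max_partitions = max_partitions
        self.cache = OrderedDict()       # day -> partition, ordered by recency

    def get(self, day):
        if day in self.cache:
            self.cache.move_to_end(day)  # mark as most recently used
        else:
            self.cache[day] = self.fetch(day)
            if len(self.cache) > self.max_partitions:
                self.cache.popitem(last=False)  # flush the least recently used day
        return self.cache[day]
```
</pre><p>As the user scrubs to a date, `get` loads that day's partition and evicts the oldest one, which is what keeps total browser memory bounded.</p><p>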
As a future exploration, we’d like to stream this data directly to the GPU to enable even better performance.</p><h3>Traffic Flow</h3><p>Javid Hsueh</p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FpdQtSha2W5M%3Ffeature%3Doembed&amp;url=http%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DpdQtSha2W5M&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FpdQtSha2W5M%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="640" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/6a920f1370b167af5f81024875605340/href">https://medium.com/media/6a920f1370b167af5f81024875605340/href</a></iframe><p>This project simplifies the road network and visualizes the traffic flow of the city. The junctions will move toward the major traffic direction at each hour.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/766/1*WQvGUIwWfI4Ko3Kn55wglQ.gif" /></figure><p>The two main challenges of this project were the road network simplification and using transform feedback in a custom <a href="https://deck.gl">deck.gl</a> layer to animate the flow on the edges.</p><h3>ML Speeds Limit</h3><p>David Schnurr, Ben Kramer</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*CCPiKs-mCbnh4ugdcKEWYA.png" /><figcaption>Predicted speed limits from selected San Francisco streets based on Movement Speeds Data</figcaption></figure><p>Basemaps like OSM often have inconsistent availability of speed limit data. 
Can we build a supervised machine learning model that utilizes Movement Speeds data to predict speed limits on roads where it’s not available?</p><p>Our team had limited ML experience, so a lot of time was spent learning about model selection (we settled on a decision tree regressor) and feature engineering.</p><p>The final model’s accuracy was actually relatively decent, and we think that, given extra time, it could be improved even further with better feature engineering, hyper-parameter tuning, and testing deep learning approaches.</p><h3>SpeedsUp</h3><p>Bryant Luong, Lezhi Li</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*uucKjxfK0e4RECwoHr7vgQ.png" /></figure><p>In this project, we tried to discover daily speed patterns of various streets in a city using machine learning on Movement Speeds data. On the right, we represent streets as time series of speed over the course of the day. We can then cluster these time series by similarity — the streets which are always fast, always slow, fast and slow at similar times, etc.</p><p>We used Apache Arrow to load a very large dataset in the browser and <a href="https://www.tensorflow.org/">TensorFlow.js</a> for efficient in-browser machine learning.</p><h3>Really Fast Rectangles</h3><p>Brian Ford</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/685/1*IHi1VguJVptv6u1SehZKnA.gif" /></figure><p>Can aggregation and drawing functions based on the incremental lambda calculus be a practical way to speed up data visualization programs? 
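</p><p>By "incremental" we mean maintaining an aggregate by applying changes (a row entered or left the view) instead of recomputing over all rows. A toy sketch of the idea (our own illustration, not the hackathon code, which targeted the browser):</p><pre>
```python
class IncrementalMean:
    """A mean that is updated from deltas instead of being recomputed."""

    def __init__(self):
        self.total = 0.0
        self.count = 0

    def add(self, value):     # a row entered the current view
        self.total += value
        self.count += 1

    def remove(self, value):  # a row left the current view
        self.total -= value
        self.count -= 1

    @property
    def value(self):
        return self.total / self.count if self.count else 0.0
```
</pre><p>Sums, counts, and means decompose cleanly like this; order statistics such as medians do not, which hints at why not every aggregation benefits.</p><p>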
To find out, we wrote one data visualization using d3.js and another one using incremental functions, and then benchmarked the two approaches.</p><p>While our intuition was correct in that lambda calculus can yield drastic performance improvements, we also found that writing aggregations in terms of changes is surprisingly difficult, and that not all types of aggregation benefit from this approach.</p><h3>How to Animate Anything</h3><p>Shan He<br><em>Grand prize</em></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/800/1*8YNnAY3BsXCDiIbotY3o0Q.gif" /><figcaption>Heights and colors are animated in kepler.gl</figcaption></figure><p>In this project, we wanted to experiment with deck.gl’s attribute transitions in kepler.gl to visualize speed changes over time.</p><p>Animation is, at its core, the transition of visual attributes (position, color, radius, thickness) based on time. In kepler.gl, layer visual attributes can be encoded by selected field values. However, there is currently no way to animate these attributes based on time: e.g., you can draw road segments and color them by speed, but you can’t animate the colors based on speed changes over time.</p><p>The way we allow the animation of visual attributes (position, color, radius, etc.) is based on an additional time table. The user can select a dataset to draw the shape (road segments), and another dataset that contains a time table (speed per segment per hour). 
They can then join these two data tables and enable animation for visual attributes that are based on value (speed, # of cars, # of passengers, etc).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/800/1*ug3mrOl4u65NaWR3HFNijQ.gif" /><figcaption>Joining an attribute dataset with a time dataset</figcaption></figure><h3>Cinematic kepler.gl</h3><p>Chris Gervang, Chun Jiang, Alison Lee</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*icCoqX0ohOlz5CU6C3SmXQ.png" /><figcaption>Building animation tooling into the kepler.gl UI</figcaption></figure><p>This project provides the ability to sequence animations and generate videos in <a href="https://kepler.gl">kepler.gl</a> using an easy magic keyframe UI. It integrates a WIP video library, hubble.gl, into kepler.gl delivering instant videos, right to your downloads! It demonstrates the new <a href="https://luma.gl">luma.gl</a> Timeline and KeyFrames classes.</p><p>We were challenged with striking a balance between simplicity and functionality in the UI design, by writing new UI components into <a href="https://kepler.gl">kepler.gl</a>, and by figuring out how to integrate the necessary parts of <a href="https://luma.gl">luma.gl</a>, <a href="https://deck.gl">deck.gl</a>, <a href="https://kepler.gl">kepler.gl</a>, and hubble.gl together.</p><h3>Citymorph</h3><p>Xiaoji Chen, Jerome Cukier<br><em>Best use of frameworks prize</em></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/600/1*ameO006OZepu3sxWrNMoWw.gif" /></figure><p>From the Movement Speeds dataset, we can deduce how much time it takes, on average, to traverse street segments at various times of the day. 
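</p><p>Concretely, a segment's expected traversal time at a given hour is just its length divided by the observed speed for that hour. A minimal sketch with made-up numbers:</p><pre>
```python
def traversal_time_min(length_km, speed_kmph):
    """Expected minutes to traverse a street segment at an observed speed."""
    return 60.0 * length_km / speed_kmph

# The same 0.5 km segment, congested at 8am vs. free-flowing at 3am:
rush_hour = traversal_time_min(0.5, 15.0)   # 2.0 minutes
late_night = traversal_time_min(0.5, 45.0)  # about 0.67 minutes
```
</pre><p>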
We can use that data to morph the maps of cities and from one given point, reposition all street intersections according to how much time it would take to go there.</p><p>One interesting aspect of this project is that we did the required graph traversal directly in the GPU.</p><p>(The interactive version of this project will be coming shortly to the <a href="https://deck.gl/#/showcases/overview">deck.gl showcase</a>.)</p><h3>Hex Hedgehogs</h3><p>Nick Rabinowitz, Ib Green</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/640/1*M5o2t7ElLMOeyzgeFLCreg.gif" /></figure><p>The Hex Flow project was intended to prototype visualization approaches for city-level trip flow patterns, using the <a href="https://uber.github.io/h3/#/">H3 library</a> and <a href="https://deck.gl/#/">deck.gl</a>. Using several weeks of anonymized routes from <a href="http://jump.com">Jump Bike</a> data, we aggregated flow patterns by hour of week for every H3 cell in the city, then built an interface that could display flow as an animated vector field.</p><p>The primary challenge was to aggregate tens of thousands of routes into a format that was both performant and visually comprehensible. Our final version included both offline and runtime aggregation, including multi-pass smoothing, to generate an H3-based vector field for each time slice and interpolate between them.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*lE8NKRF7OVSleABQhv0O8A.png" /></figure><p>Validating the data processing and visualization would have been very difficult without knowledge of the city in question. 
We knew we were on the right track when we could see the clear bike commute patterns up and down Market Street at appropriate times!</p><h3>Map Collage</h3><p>Yang Wang, Jian Huang, Ravi Akkenapally</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*KeamuHWMol7F0kiZu_YrBQ.png" /></figure><p>Map Collage is like photo collage, but (wait for it) for maps.</p><p>This consolidates multiple sub-maps into one canvas with dynamic boundaries generated by the underlying data. It frees you from constant zooming in and out when examining multiple places-of-interest with limited screen space (e.g. on mobile devices or in a UI widget).</p><p>The hardest part of the project was to interactively and dynamically generate the boundaries of each sub-map.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=56096209dce2" width="1" height="1" alt=""><hr><p><a href="https://medium.com/vis-gl/vis-hackathon-2019-56096209dce2">Vis Hackathon 2019</a> was originally published in <a href="https://medium.com/vis-gl">vis.gl</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Kepler.gl: Export Map]]></title>
            <link>https://medium.com/vis-gl/kepler-gl-export-map-c3c0804e55b8?source=rss----7d5390e55872---4</link>
            <guid isPermaLink="false">https://medium.com/p/c3c0804e55b8</guid>
            <category><![CDATA[maps]]></category>
            <category><![CDATA[javascript]]></category>
            <category><![CDATA[keplergl]]></category>
            <dc:creator><![CDATA[Giuseppe Macrì]]></dc:creator>
            <pubDate>Tue, 23 Jul 2019 05:13:20 GMT</pubDate>
            <atom:updated>2019-07-23T05:13:20.851Z</atom:updated>
<content:encoded><![CDATA[<p>Developed by Uber and contributed to the LF Urban Computing Foundation, <a href="https://kepler.gl/">Kepler.gl</a> is an open source geospatial analysis tool for large-scale data sets.</p><p>I previously posted two articles describing some of the application’s features:</p><ul><li><a href="https://medium.com/vis-gl/animating-40-years-of-california-earthquakes-e4ffcdd4a289">Animating 40 years of California earthquakes</a></li><li><a href="https://medium.com/vis-gl/kepler-gl-dropbox-map-save-share-b4a41a75715b">Kepler.gl + Dropbox = Save and share your maps</a></li></ul><p>In this article, we are going to describe the <strong>export map </strong>feature in <a href="https://kepler.gl/">Kepler.gl</a>.</p><h3>Let’s create a simple map</h3><p>In order to export a map, you first need to create one. The <a href="https://kepler.gl/">Kepler.gl website</a> has some predefined maps:</p><ul><li>Visit the <a href="https://kepler.gl/demo/earthquakes">Kepler.gl earthquake demo</a></li></ul><p>The new map will look like the screenshot below, showing all California earthquakes over the past 40 years.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*2pRz9-CUi2pLukHptH3sZQ.png" /><figcaption>California Earthquake Map</figcaption></figure><p>Once you are happy with the map and all the layers you have created on it (see the <a href="https://github.com/keplergl/kepler.gl/blob/master/docs/user-guides/a-introduction.md">user guide</a>), move on to exporting the current visualization by clicking on the share button (top left corner).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*jgiib-mg-cYWruz9pqf_jg.png" /><figcaption>Highlighted Share Button</figcaption></figure><p>Upon clicking the <strong>Share</strong> button, you will be provided with a list of options, as shown below.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*QPir4i0niw07Eq6VPfaymg.png" 
/><figcaption>Export Options</figcaption></figure><p>Click on <a href="https://github.com/keplergl/kepler.gl/blob/master/docs/user-guides/k-save-and-export.md#export-map"><strong>Export Map</strong></a>.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ye0GHFG4ackVFrm483gEUQ.png" /></figure><p>After clicking on Export Map, <a href="https://kepler.gl/">Kepler.gl</a> will show a modal dialog where you can select the format of your exported map, as shown in the following screenshot.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*l3vH3B9p6g2rHmyX1jjdJg.png" /><figcaption>Export modal dialog</figcaption></figure><p><a href="https://kepler.gl/">Kepler.gl</a> provides two different options:</p><ul><li>HTML: create a single HTML file of your map with data and configuration.</li><li>JSON: create a JSON file with your current map data and config. You can later load it back into kepler.gl.</li></ul><h3>Export Map as HTML</h3><p>The default export map option is HTML; see the export modal dialog image above. As part of the HTML export options, you can provide a Mapbox token to be saved in the HTML file.</p><p>If you don’t provide a Mapbox token, Kepler.gl will use a temporary one, which can expire later. We recommend using your own Mapbox token.</p><p>If you don’t have an access token when you export the map, you can always <a href="https://docs.mapbox.com/help/how-mapbox-works/access-tokens/">create a new token</a> and <a href="https://github.com/keplergl/kepler.gl/blob/master/docs/user-guides/k-save-and-export.md#how-to-update-an-exported-map-token">update the generated HTML</a> file later when necessary.</p><p>Data and configuration will be inlined in the single HTML file. 
You can use it as a static page and share it with other people.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*qEUWpQSCPZ28aG-RxFTHZA.gif" /><figcaption>Export to html</figcaption></figure><blockquote>The generated HTML file can be pretty big since it contains all map data.</blockquote><h3>Export Map as JSON</h3><p>JSON is the second available option to export your map. Once you click on the JSON button, Kepler.gl will show the following modal window.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*EH5qLh4fOcCz-Fd1pZbnHw.png" /><figcaption>Export JSON Map</figcaption></figure><p>The lower part of the modal dialog shows the current map configuration.</p><blockquote><em>The map configuration is coupled with loaded datasets; each part of the configuration will contain a </em><strong><em>dataId</em></strong><em> which will refer to one of the loaded datasets.</em></blockquote><p>Once you click on Export, Kepler.gl will create a JSON file containing both map configuration and data. The newly generated file can be used to recreate the original map. 
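</p><p>Schematically, the exported JSON couples the config to the data through the dataset id. The structure below is a simplified, hand-made illustration of that coupling, not a verbatim kepler.gl export:</p><pre>
```python
# Simplified shape of an exported map (illustrative field names only).
exported = {
    "datasets": [{"info": {"id": "earthquakes"}, "data": []}],
    "config": {"visState": {
        "layers": [{"config": {"dataId": "earthquakes", "label": "Points"}}],
        "filters": [{"dataId": "earthquakes", "name": "Magnitude"}],
    }},
}

# Every dataId referenced by the config must match a loaded dataset id,
# otherwise that part of the config cannot be applied.
dataset_ids = {d["info"]["id"] for d in exported["datasets"]}
assert all(layer["config"]["dataId"] in dataset_ids
           for layer in exported["config"]["visState"]["layers"])
```
</pre><p>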
See below.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*p4wDwWu6IKl1GFXWZbkv6Q.gif" /><figcaption>Export to JSON</figcaption></figure><blockquote>The generated JSON file can be pretty big since it contains all map data.</blockquote><h3>Conclusions</h3><p>In this post we reviewed the Kepler.gl export map functionality to create an HTML or JSON file.</p><p>HTML export can be really useful if you want a static representation of your map to include on your website, share with other users, or view offline on your computer.</p><p>JSON is a lean representation of your map; the file can be stored on a CDN, shared with other people, and loaded later to reproduce the exact same map.</p><p>Exporting maps and data is part of the core abilities Kepler.gl provides, and we are continuing to improve them and add new features.</p><p>If you want to know more about Kepler.gl export features, check the <a href="https://github.com/keplergl/kepler.gl/blob/master/docs/user-guides/k-save-and-export.md">user guides</a>.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=c3c0804e55b8" width="1" height="1" alt=""><hr><p><a href="https://medium.com/vis-gl/kepler-gl-export-map-c3c0804e55b8">Kepler.gl: Export Map</a> was originally published in <a href="https://medium.com/vis-gl">vis.gl</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Introducing kepler.gl for Jupyter]]></title>
            <link>https://medium.com/vis-gl/introducing-kepler-gl-for-jupyter-f72d41659fbf?source=rss----7d5390e55872---4</link>
            <guid isPermaLink="false">https://medium.com/p/f72d41659fbf</guid>
            <category><![CDATA[geospatial]]></category>
            <category><![CDATA[open-source]]></category>
            <category><![CDATA[keplergl]]></category>
            <category><![CDATA[jupyter-notebook]]></category>
            <category><![CDATA[data-science]]></category>
            <dc:creator><![CDATA[Shan He]]></dc:creator>
            <pubDate>Tue, 25 Jun 2019 17:24:32 GMT</pubDate>
            <atom:updated>2019-06-25T17:24:32.153Z</atom:updated>
<content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*xqaNw3bbVtYk0TbIVG0rQg.png" /></figure><p><a href="http://kepler.gl">kepler.gl</a> is an advanced geospatial visualization tool open sourced by Uber’s Visualization team in 2018 and contributed to the <a href="https://www.linuxfoundation.org/projects/urban-computing/">Urban Computing Foundation</a> earlier this year. At Uber, kepler.gl is the de facto tool for geospatial data analysis.</p><p>In order to help data scientists work more effectively, we integrated kepler.gl into many widely used data analysis platforms, now including Jupyter Notebook. Jupyter Notebook is a popular open source web application used to create and share documents that contain live code, equations, visualizations, and text, commonly used among data scientists to conduct data analysis and share results. At Uber, data scientists have utilized this integration to analyze multitudes of geospatial data collected through the app, in order to better understand how people use Uber, and how to improve their trip experience. Now, everyone can leverage kepler.gl within Jupyter Notebook.</p><p>We integrated <a href="https://pypi.org/project/keplergl/">kepler.gl as a Jupyter Widget</a>. It loads kepler.gl inside a notebook cell, allowing users to quickly plot maps with simple Python commands and interact with the UI to customize the visualization (Figure 1). It provides a seamless analysis workflow, combining data querying, transformation, analysis, and visualization — all inside Jupyter Notebook.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*h6A3KSr7WjMychm7OIau-g.png" /><figcaption>Figure 1. 
Using kepler.gl in Jupyter Notebook to visualize geospatial data</figcaption></figure><h3>How to use the kepler.gl widget in Jupyter Notebook</h3><p><strong>You can find the complete </strong><a href="https://github.com/keplergl/kepler.gl/blob/master/docs/keplergl-jupyter/user-guide.md"><strong>user guide</strong></a><strong> and </strong><a href="https://github.com/keplergl/kepler.gl/blob/master/docs/keplergl-jupyter/user-guide.md#demo-notebooks"><strong>demo notebooks</strong></a><strong> in the </strong><a href="https://github.com/keplergl/kepler.gl"><strong>kepler.gl repo</strong></a><strong> documentation folder.</strong></p><p>First, install keplergl with pip.</p><pre>$ pip install keplergl</pre><p>Launch Jupyter Notebook either on your local machine or on a server.</p><pre>$ jupyter notebook</pre><p>Load the kepler.gl widget with the command below, and an empty kepler.gl map will be loaded below the cell (Figure 2). You can use the `height` parameter to define window size.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/277683e98008e5b053d54ff2e236d4e6/href">https://medium.com/media/277683e98008e5b053d54ff2e236d4e6/href</a></iframe><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*q7HpoiYfn1kKoWlD4Slvtw.png" /><figcaption>Figure 2. Load an empty kepler.gl map</figcaption></figure><p>Now, let’s add data to the map. Like the kepler.gl app, the kepler.gl widget supports CSV and GeoJSON. It also supports <a href="http://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.html">Pandas DataFrame</a> and <a href="http://geopandas.org/reference/geopandas.GeoDataFrame.html">GeoPandas GeoDataFrame</a>. For more on supported data formats, go to the <a href="https://github.com/keplergl/kepler.gl/blob/master/docs/keplergl-jupyter/user-guide.md#3-data-format">Data Format</a> section of the user guide. Call add_data to add data to the map. 
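</p><p>Putting these commands together, a minimal session looks roughly like the sketch below. It assumes pandas and the keplergl package are installed (the import is guarded so the data part stands on its own); the column names and values are made up for illustration:</p><pre>
```python
import pandas as pd

# A tiny point dataset (made-up values for illustration).
df = pd.DataFrame({
    "latitude": [37.7853, 37.7810, 37.7776],
    "longitude": [-122.4005, -122.4101, -122.4166],
    "magnitude": [2.3, 3.1, 1.8],
})

try:
    from keplergl import KeplerGl  # pip install keplergl
    map_1 = KeplerGl(height=500)                 # `height` sets the widget size
    map_1.add_data(data=df, name="earthquakes")  # `name` is the dataset id
except ImportError:
    map_1 = None  # keplergl is not installed in this environment
```
</pre><p>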
A dataset is required to have a name; the name is the id of the dataset in the kepler.gl config and is used to link layers and filters to it.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/74e5314b8121063e2bd5d97461d35781/href">https://medium.com/media/74e5314b8121063e2bd5d97461d35781/href</a></iframe><p>After the data is loaded into the map, you can use the side panel to edit the layers, filters, and base map style just as you would with kepler.gl (Figure 3).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/781/1*Ht-mNIfvv0R1KzlzgTnrOQ.gif" /><figcaption>Figure 3. Interact with kepler.gl to edit layers and filters</figcaption></figure><p>When you are happy with the final result, print out the current map configuration using .config and save it to reuse later as a template (Figure 4).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*-2xmDQ_XXL1Fc1S-VbYTtg.jpeg" /></figure><p>When calling keplergl.KeplerGl(), you can pass in config and data values to initialize a predefined map. Pay attention to the name of the data and the dataId saved in the layer and filter config; they need to match each other for the config to be applied. Read more about <a href="https://github.com/keplergl/kepler.gl/blob/master/docs/keplergl-jupyter/user-guide.md#6-match-config-with-data">match config with data</a>.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/7ae2d634e02d740982dcdaa197151dd6/href">https://medium.com/media/7ae2d634e02d740982dcdaa197151dd6/href</a></iframe><p>kepler.gl also supports saving the map as an interactive HTML document with save_to_html.</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/dbfb90f4dda89075f707a6e616088955/href">https://medium.com/media/dbfb90f4dda89075f707a6e616088955/href</a></iframe><p>kepler.gl map data and config are saved to IPython widget state. 
This allows the notebook file to be rendered with the map included. Unfortunately, widget state is not automatically saved when the kernel shuts down, which means that in order to load the map after restarting the kernel, you need to manually save the notebook widget state before shutting down the kernel (Figure 5).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/680/1*VAJbejWPdcQorxqD7CYhAA.png" /><figcaption>Figure 5. Save widget state to preserve map</figcaption></figure><p>That’s it! This is all you need to know to use kepler.gl inside Jupyter Notebook. You can submit <a href="https://github.com/keplergl/kepler.gl/issues/new?assignees=heshan0131&amp;labels=jupyter&amp;template=bug-report--jupyter-widget.md&amp;title=%5BBug%5D%5BJupyter+Widget%5D">bug reports</a> and <a href="https://github.com/keplergl/kepler.gl/issues/new?assignees=heshan0131&amp;labels=jupyter&amp;template=feature-request--jupyter-widget.md&amp;title=">feature requests</a> using our <a href="https://github.com/keplergl/kepler.gl/issues/new/choose">GitHub templates</a>. Make sure to share your maps and notebooks built with the kepler.gl widget on Twitter with <a href="https://twitter.com/search?q=%23keplergl&amp;src=typd">#keplergl</a> or tag <a href="https://twitter.com/heshan_cheri">@heshan_cheri</a>.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=f72d41659fbf" width="1" height="1" alt=""><hr><p><a href="https://medium.com/vis-gl/introducing-kepler-gl-for-jupyter-f72d41659fbf">Introducing kepler.gl for Jupyter</a> was originally published in <a href="https://medium.com/vis-gl">vis.gl</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Uber’s vis.gl brings glTF to geospatial data visualization]]></title>
            <link>https://medium.com/vis-gl/ubers-vis-gl-brings-gltf-to-geospatial-data-visualization-86208ddec91d?source=rss----7d5390e55872---4</link>
            <guid isPermaLink="false">https://medium.com/p/86208ddec91d</guid>
            <category><![CDATA[javascript]]></category>
            <category><![CDATA[geospatial]]></category>
            <category><![CDATA[webgl]]></category>
            <category><![CDATA[3d]]></category>
            <category><![CDATA[gltf]]></category>
            <dc:creator><![CDATA[Georgios Karnas]]></dc:creator>
            <pubDate>Tue, 18 Jun 2019 20:11:56 GMT</pubDate>
            <atom:updated>2019-06-18T20:36:23.802Z</atom:updated>
            <content:encoded><![CDATA[<blockquote>The following was originally posted to the Khronos® Group’s blog “<a href="https://www.khronos.org/blog/ubers-vis.gl-brings-gltf-to-geospatial-data-visualization">Uber’s vis.gl brings glTF to geospatial data visualization</a>”</blockquote><blockquote><strong><em>Authored by</em></strong><em> Georgios Karnas, Ib Green, Travis Gorkin and Xintong Xia</em></blockquote><p>In 2016, the Uber Visualization team released an open source version of <a href="https://deck.gl/">deck.gl</a> and <a href="https://luma.gl/">luma.gl</a>, two Khronos Group <a href="https://www.khronos.org/webgl/">WebGL™</a>-powered frameworks for visualizing and exploring huge geospatial data sets on maps. Since then, the technology has flourished into a full-fledged suite of over a dozen open source WebGL and GPGPU data visualization libraries and tools, known collectively as <a href="https://vis.gl/">vis.gl</a>. <a href="https://loaders.gl/">loaders.gl</a>, the newest addition to the vis.gl family, adds support for loading and rendering <a href="https://www.khronos.org/gltf/">glTF™</a> assets across the tech stack. This unlocks the ability to include rich 3D content within data visualization applications built using luma.gl and deck.gl, enabling a variety of interesting new use cases. In this post, we’ll show some applications and walk through how you can use deck.gl and glTF, Khronos’ open standard 3D file format, to quickly create a geospatial data visualization that renders tens of thousands of 3D models.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1008/1*FRB7JYIcGkHJ2WwW1ZmQRA.gif" /><figcaption>glTF model in <a href="http://deck.gl/">deck.gl</a> (playback mode in <a href="https://avs.auto/">AVS.auto</a>). “Vision meets Robotics: The KITTI Dataset” licensed under CC BY-NC-SA 3.0. 
“<a href="https://sketchfab.com/3d-models/lexus-is-f-58fbbe92ea684d248dbff66ceb089816">Lexus IS F 3D Model</a>” by Yo.Ri licensed under CC BY 4.0.</figcaption></figure><h3>The Case for glTF</h3><p>Prior to glTF, it would not have been practical to add support for high-quality 3D models to a visualization framework like deck.gl. Being able to mix 3D models into visualizations is, of course, appealing, but the work involved would have been too big and difficult to prioritize, as data visualizations are typically more focused on abstract visualization types like scatterplots, lines, and extruded geometric primitives.</p><p>However, glTF has quickly risen to become a major format for 3D assets in the WebGL world: glTF is supported by many major companies and products; online model marketplaces support downloading models in glTF; many popular tools support importing and exporting models in glTF, including the open-source Blender. Additionally, glTF models can now be loaded into a growing number of WebGL frameworks, as well as opened in Windows Explorer and dropped right into the Facebook feed. glTF is not only a format but an ecosystem, and we wanted to let deck.gl users leverage the benefits of that ecosystem.</p><p>A major upside of the wide adoption of glTF is that the same models can now be loaded into most major WebGL frameworks. 
Thanks to a well-defined, modern Physically Based Rendering (PBR) material and lighting model, glTF assets render identically (and look spectacular) regardless of where they are loaded.</p><p>By following the clear and concise glTF specification, building the Khronos reference PBR shader implementation, and consulting many invaluable open source resources, deck.gl was able to relatively quickly implement support for importing and rendering high-quality 3D models at a quality which is at parity with the best gaming-focused WebGL frameworks.</p><p>To us, this is a testament to the power of glTF and a major example of the benefits of open, royalty-free, and thoughtfully-defined standards within the web and computer graphics industries.</p><h3>glTF for Data Visualization</h3><p>Earlier this year, the Uber Visualization team, in partnership with Uber’s Advanced Technology Group (ATG), released the open source <a href="https://avs.auto/">AVS.auto</a> system. AVS.auto leverages the vis.gl tech stack for visualizing and exploring autonomous vehicle data, such as point clouds from LIDAR sensors and predicted vehicle trajectories. AVS.auto and deck.gl are being adopted across a wide range of autonomous vehicle companies to build tools that render increasingly realistic scenes containing real-world objects, like vehicles, bikes, pedestrians, and other street/city features.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*MIh0416tNPoRk_K2lrffPQ.jpeg" /><figcaption>A glTF Model loaded into <a href="https://avs.auto/">AVS.auto</a>, integrated with LIDAR point clouds and perception-based visuals.</figcaption></figure><p>In our own tools, we use the visual difference between realistic vs. abstract visual elements as part of our design vocabulary. 
We use more realistic designs (such as glTF models) for real world objects and contrast that with abstract “Tron”-like graphics that show paths, trajectories, and point clouds to represent what the autonomous system is seeing, predicting, or planning to do. In addition, the geospatial features of deck.gl enable glTF models to be correctly positioned on top of base maps and visually interleaved with deck.gl’s standard abstract 3D visual elements.</p><p>Another interesting use case involves replacing abstract visual elements (such as points, lines, or arcs) in deck.gl visualizations with instances of glTF models that represent something concrete about the data.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*LUN3NWHyz0SUUDcB8jxOzQ.jpeg" /><figcaption>Instanced Rendering of glTF models as visual markers on a base map in <a href="https://kepler.gl/">kepler.gl</a></figcaption></figure><p>deck.gl now offers a <a href="https://deck.gl/#/documentation/deckgl-api-reference/layers/scenegraph-layer">ScenegraphLayer</a> that accepts a glTF scenegraph. Like all deck.gl layers, the ScenegraphLayer also accepts a table of data, and each row in the supplied data table is mapped to one instance of the glTF model during rendering, with offset, rotation, and scale extracted from each table row. The ScenegraphLayer uses instanced rendering on the GPU (the same technique that is used to performantly render large numbers of similar soldiers, vegetation elements, etc. in games), which enables remarkably performant rendering of thousands of identical models.</p><h3>How to Use glTF in luma.gl, deck.gl and nebula.gl</h3><p>It is very easy to create a glTF layer in deck.gl; all you have to do is import <em>ScenegraphLayer</em> and pass the proper parameters. Here is an example that allows you to place glTF Meshes in NYC: in our case, avocados. 
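As a rough sketch of the props involved (the avocado asset path, the data rows, and the layer id below are hypothetical placeholders, not taken from the linked example; prop names follow the ScenegraphLayer documentation):

```javascript
// Sketch: one glTF model instance per data row (hypothetical data and URL).
const data = [
  {coordinates: [-74.0, 40.7], heading: 0},    // one row = one model instance
  {coordinates: [-73.98, 40.75], heading: 90}
];

const scenegraphLayerProps = {
  id: 'avocado-layer',
  data,
  scenegraph: './avocado.glb',                 // hypothetical glTF asset
  getPosition: d => d.coordinates,             // per-row offset
  getOrientation: d => [0, d.heading, 90],     // per-row rotation
  sizeScale: 10                                // uniform scale for all instances
};
```

In an application these props would be passed to `new ScenegraphLayer(scenegraphLayerProps)` from `@deck.gl/mesh-layers`, and the layer handed to a Deck instance.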
This could represent places where you can buy good produce or fruits.</p><p>This example leverages <a href="https://nebula.gl/">nebula.gl</a>, another member of the vis.gl family that provides high-performance, interactive editing of deck.gl layers. The result demonstrates a geospatial glTF editor example in less than a page of code:</p><p><a href="https://codesandbox.io/s/gltf-nebulagl-l7zk1rl79m?fontsize=10">glTF-nebula.gl - CodeSandbox</a></p><p><em>You can move the map or rotate it using Option+Drag. Notice that you can have a 360-degree view of the loaded model as it rotates with the map!</em></p><h3>The Future of glTF and vis.gl</h3><p>Our frameworks have benefitted from the work done by many others on the glTF standard, and we want to give something back to the community. As a first step, we have open-sourced the framework-independent glTF and Draco loaders that we developed to support glTF in deck.gl.</p><p>These loaders became foundational building blocks for our new companion framework <a href="https://loaders.gl/"><strong>loaders.gl</strong></a>. This framework provides a growing suite of loaders for important 3D, geospatial, and visualization-focused formats, including glTF and Draco.</p><p>The <strong>GLTFLoader</strong> in loaders.gl loads all the Khronos reference models and supports all variants of the glTF format (the binary GLB format, base64-encoded JSON, and JSON with linked binary files), and also supports decoding of Draco-compressed meshes. Support for additional extensions (like KHR_lights_punctual) is in progress.</p><p><em>Khronos, EGL, glTF, NNEF, OpenVG, OpenVX, OpenXR, SPIR, SPIR-V, SYCL, Vulkan and WebGL are trademarks or registered trademarks of The Khronos Group Inc. OpenCL is a trademark of Apple Inc. and OpenGL and OpenML are registered trademarks and the OpenGL ES and OpenGL SC logos are trademarks of Hewlett Packard Enterprise used under license by Khronos. 
All other product names, trademarks, and/or company names are used solely for identification and belong to their respective owners.</em></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=86208ddec91d" width="1" height="1" alt=""><hr><p><a href="https://medium.com/vis-gl/ubers-vis-gl-brings-gltf-to-geospatial-data-visualization-86208ddec91d">Uber’s vis.gl brings glTF to geospatial data visualization</a> was originally published in <a href="https://medium.com/vis-gl">vis.gl</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Using deck.gl with Google Maps]]></title>
            <link>https://medium.com/vis-gl/using-deck-gl-with-google-maps-9c868d18e3cd?source=rss----7d5390e55872---4</link>
            <guid isPermaLink="false">https://medium.com/p/9c868d18e3cd</guid>
            <category><![CDATA[visualization]]></category>
            <category><![CDATA[webgl]]></category>
            <category><![CDATA[google-maps]]></category>
            <category><![CDATA[javascript]]></category>
            <dc:creator><![CDATA[Xiaoji Chen | 消极]]></dc:creator>
            <pubDate>Tue, 07 May 2019 23:31:02 GMT</pubDate>
            <atom:updated>2019-05-07T23:31:01.796Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*nPyFQIRLJNQWFVfuS38KhQ.jpeg" /><figcaption>deck.gl ScatterplotLayer and ArcLayer over Google Maps (data source: Natural Earth)</figcaption></figure><p><a href="http://bl.ocks.org/Pessimistress/b5e9b86cd0f69ea69f20150734d4e2b6">Live demo</a></p><p>deck.gl was originally created to work with Mapbox GL JS. All of the examples on its website still use Mapbox as the base map. Yet, since we first open-sourced the framework, we’ve been asked on GitHub: how do you use it with Google Maps?</p><p>Both map solutions use <a href="https://en.wikipedia.org/wiki/Web_Mercator_projection">Web Mercator Projection</a>, so the same camera settings should (for the most part) yield pixel-matching views. In practice, we found there was more to it than that, so we worked with the Google Maps Platform engineering team to make it happen.</p><h4><strong>So what was the issue</strong>?</h4><p>The largest issue that delayed adding Google Maps support to deck.gl was camera synchronization. For example, when a user zooms in and out of a Google map with the scroll wheel, Google Maps always settles on an integer zoom level, and applies a smooth transition between the current and new zoom levels. For deck.gl’s visualization layers to look like a seamless part of the Google map, deck.gl must redraw its canvas on every transition frame to match the base map.</p><p>To achieve this, we had two options:</p><ul><li><strong>Match the deck.gl camera with that of the base map.</strong> This requires us to listen to a camera event of the map component and update the deck.gl view state on every change. But there’s a big problem with this approach: in Google Maps, the zoom_changed event is fired only once when zoom happens, and the map.getZoom method only ever returns the target (integer) zoom level. 
This means there’s no way for deck.gl to stay synchronized with Google’s transition animation between zoom levels.</li><li><strong>Match the base map’s camera with that of deck.gl.</strong> This approach would require disabling the default interaction of Google Maps, and using deck.gl’s map controller to drive all camera changes. Bad news: we can’t do that either because the map.setZoom function of the Google Maps JS API doesn’t accept fractional values, meaning there is no way for external code to plug in smooth transition logic.</li></ul><p>So neither of these options was viable. From the API perspective, the Google Maps JS API doesn’t offer the fine-grained callbacks, getters and/or setters that deck.gl normally uses for camera synchronization. Additionally, we had no visibility into the animation curve used to control the camera in Google Maps, so we couldn’t hack our end to produce the same look. It started to look like Google Maps support wouldn’t be possible.</p><h4>The solution</h4><p>While working on deck.gl v7.0, we partnered with developers from Google to restart the investigation into how deck.gl can work seamlessly with Google Maps. Their familiarity with the Maps API gave us the eventual breakthrough. The solution involved extending the <a href="https://developers.google.com/maps/documentation/javascript/reference/#OverlayView">OverlayView</a> API and reverse-engineering the projection matrix as well as some CSS. If you are curious, <a href="https://github.com/uber/deck.gl/blob/7.0-release/modules/google-maps/src/utils.js#L84">here is how we did it</a>.</p><p>We figured you probably don’t need to know the details, so we packaged it up and offered it as a new npm module <a href="http://deck.gl/#/documentation/submodule-api-reference/deckgl-google-maps/google-maps-overlay">@deck.gl/google-maps</a>. 
To use this module, you construct a GoogleMapsOverlay instance:</p><pre>import {GoogleMapsOverlay} from &#39;@deck.gl/google-maps&#39;;</pre><pre>const overlay = new GoogleMapsOverlay({<br>  // Deck props<br>});</pre><p>And then simply use it as any other Maps overlay:</p><pre>overlay.setMap(map);</pre><p>Here is what the code would look like:</p><iframe src="" width="0" height="0" frameborder="0" scrolling="no"><a href="https://medium.com/media/1dfe2879c35e5c1537e0ae5b688597b9/href">https://medium.com/media/1dfe2879c35e5c1537e0ae5b688597b9/href</a></iframe><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ueIWauR6Ue4nCJGeR4ILEg.jpeg" /><figcaption>Mapping 203K trees in Paris using Google Maps and deck.gl’s ScatterplotLayer. Data source: <a href="https://opendata.paris.fr/">https://opendata.paris.fr/</a></figcaption></figure><p><a href="https://bl.ocks.org/Pessimistress/raw/2b2dee8d3aa01d31bb97ca2671690af9/7135d4d7eef3fd36e5a245b0551dab1f7ff4ef70/">Live demo</a></p><p>All deck.gl layers and picking functionalities (e.g. onHover and onClick) work with the Google Maps integration.</p><p>The Google Maps JavaScript library does not support free rotation of the camera, so sadly the 3D layers of deck.gl may not work very well. deck.gl does not currently support zoom levels below zero, so it will hide itself when the user zooms all the way out.</p><h4>What’s next</h4><p>Because we don’t use Google Maps for internal apps at Uber, this new module might not be as battle-tested as the rest of the framework. We ask external developers to kindly provide us with feedback if anything can be improved.</p><p>This is not the end of our work with Google Maps. The Google Maps API team plans to offer more of its WebGL infrastructure through the JS API and is committed to working closely with us as it brings its 3D rendering to external developers. We love the richness of Google’s map data and its beautiful visuals. 
I’m personally dying to put my data in the same 3D space as its building models and Street View.</p><p>Big thanks to everyone who worked on this feature: <a href="https://github.com/MeTaNoV">Pascal Gula</a>, <a href="https://github.com/ibgreen">Ib Green</a>, <a href="https://github.com/donmccurdy">Don McCurdy</a>, and Travis McPhail.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=9c868d18e3cd" width="1" height="1" alt=""><hr><p><a href="https://medium.com/vis-gl/using-deck-gl-with-google-maps-9c868d18e3cd">Using deck.gl with Google Maps</a> was originally published in <a href="https://medium.com/vis-gl">vis.gl</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Introducing deck.gl v7.0]]></title>
            <link>https://medium.com/vis-gl/introducing-deck-gl-v7-0-c18bcb717457?source=rss----7d5390e55872---4</link>
            <guid isPermaLink="false">https://medium.com/p/c18bcb717457</guid>
            <category><![CDATA[webgl]]></category>
            <category><![CDATA[visualization]]></category>
            <category><![CDATA[javascript]]></category>
            <category><![CDATA[open-source]]></category>
            <dc:creator><![CDATA[Xiaoji Chen | 消极]]></dc:creator>
            <pubDate>Fri, 26 Apr 2019 18:59:07 GMT</pubDate>
            <atom:updated>2019-04-27T07:37:03.423Z</atom:updated>
            <content:encoded><![CDATA[<p><em>by Uber Visualization Frameworks Team</em></p><p>Since deck.gl’s debut as an open-source framework designed for <a href="https://kepler.gl">geospatial analytics</a>, we have continued to build boundary-pushing WebGL-powered applications at Uber, such as <a href="https://avs.auto">autonomy visualization</a>, <a href="https://neb.gl">map editing</a>, and <a href="https://eng.uber.com/manifold/">machine learning</a>. We have also seen a tremendous amount of enthusiasm from our open-source community, and learned about the unique use cases and challenges that each of our users face.</p><p>This week, we are excited to announce a new deck.gl milestone: <a href="http://deck.gl/#/documentation/overview/whats-new">v7.0</a>. This is a major step forward for the framework, with more powerful visuals, a more extensible architecture, better support for industry standards, and an easier learning curve. The internal architecture has been substantially rewritten to support a multitude of new features planned for the months to come.</p><h3>A Growing Layer Catalog</h3><p>In this release, we have added 11 new layers to deck.gl’s official layer catalog to make common scenarios easier to implement.</p><h4><a href="http://deck.gl/#/documentation/deckgl-api-reference/layers/tile-layer"><strong>TileLayer</strong></a></h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/512/1*IwASLkDgaFdBIPqsDMeXAg.gif" /><figcaption><em>deck.gl rendering OpenStreetMap raster tiles with the TileLayer</em></figcaption></figure><p>First introduced in deck.gl v6.3, the TileLayer has graduated from experimental status. This layer loads and renders data within the current viewport using the <a href="https://wiki.openstreetmap.org/wiki/Slippy_map_tilenames#Resolution_and_Scale">OSM tile indexing system</a>. The data source, format and rendering can all be customized. 
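A hedged sketch of what that customization might look like (the `{z}/{x}/{y}` URL template follows the OSM tile scheme; `renderSubLayers` and `tile.bbox` are per the TileLayer documentation, while the id and zoom limits are illustrative):

```javascript
// Sketch: a raster TileLayer whose tiles come from an OSM-style URL template.
const tileLayerProps = {
  id: 'osm-raster-tiles',
  // Each tile is fetched by substituting {z}/{x}/{y} per the OSM indexing system
  data: 'https://c.tile.openstreetmap.org/{z}/{x}/{y}.png',
  minZoom: 0,
  maxZoom: 19,
  renderSubLayers: props => {
    // Each loaded tile is rendered as one image covering its bounding box;
    // in an app: return new BitmapLayer({...props, image: props.data, bounds});
    const {west, south, east, north} = props.tile.bbox;
    return {image: props.data, bounds: [west, south, east, north]};
  }
};
```

These props would be passed to `new TileLayer(tileLayerProps)` from `@deck.gl/geo-layers`; swapping the URL and `renderSubLayers` is how vector tiles or other data sources are handled.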
It demonstrates how large datasets can be spatially-indexed and interactively loaded based on the camera. You can find examples that render <a href="http://deck.gl/#/examples/core-layers/tile-layer">raster map tiles</a> and <a href="http://deck.gl/#/documentation/deckgl-api-reference/layers/tile-layer">vector map tiles</a> on the website.</p><h4><a href="http://deck.gl/#/documentation/deckgl-api-reference/layers/bitmap-layer"><strong>BitmapLayer</strong></a></h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/512/1*07hv5-vxNAJbIGG1mXVtlw.png" /><figcaption><em>deck.gl overlaying a custom image over San Francisco with the BitmapLayer</em></figcaption></figure><p>The BitmapLayer places a rectangular image in world space, with options for simple image filtering effects. It is especially useful when composing vector data with satellite imagery or raster scans of the environment. Check out the <a href="http://deck.gl/showcases/gallery/bitmap-layer">weather system example</a> for how to use this layer.</p><h4><a href="http://deck.gl/#/documentation/deckgl-api-reference/layers/column-layer"><strong>ColumnLayer</strong></a></h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/512/1*36HqEjwLU25Q7WY9eMOVcw.png" /><figcaption><em>deck.gl showing the distribution of bike racks in San Francisco with the ColumnLayer</em></figcaption></figure><p>The ColumnLayer is a generalized replacement of the HexagonCellLayer and GridCellLayer from deck.gl v6. 
It renders a cylinder at each given position with specified height and color, useful for constructing 3D bar-charts that visualize density.</p><h4><a href="http://deck.gl/#/documentation/deckgl-api-reference/layers/trips-layer"><strong>TripsLayer</strong></a></h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/512/1*bxjOxQsSO4791dmS5pACDA.gif" /><figcaption><em>deck.gl visualizing a vehicle’s movement with the TripsLayer</em></figcaption></figure><p>The TripsLayer, as can be seen on the deck.gl <a href="https://deck.gl">home page</a>, has also graduated from experimental status. Support for customizable line width, a much-requested feature, has been restored.</p><h4><a href="http://deck.gl/#/documentation/deckgl-api-reference/layers/s2-layer"><strong>S2Layer</strong></a><strong> and </strong><a href="http://deck.gl/#/documentation/deckgl-api-reference/layers/h3-hexagon-layer"><strong>H3 Layers</strong></a></h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*NNFjaF_Oxlw2-Zdm" /><figcaption><em>deck.gl’s S2Layer, H3HexagonLayer and H3ClusterLayer</em></figcaption></figure><p>Popular spatial indexing systems such as <a href="http://s2geometry.io/">S2</a> and <a href="https://uber.github.io/h3/#/documentation/core-library/overview">H3</a> divide the surface of the earth into linearly indexed grid cells. Geospatial data can be efficiently aggregated, filtered and queried within these systems. The new S2Layer and H3 layers make it easy to visualize datasets that use the S2 or H3 indexes.</p><p>There are more! 
Check out our <a href="http://deck.gl/#/documentation/overview/whats-new">what’s new</a> page for a full list.</p><h3>Loaders.gl and Mesh Layers</h3><p>In v7.0, we added a new category of layers: the mesh layers, which render an arbitrary 3D object at each given position with a specified size, orientation, and color.</p><p>To facilitate the loading of 3D models into these layers, we have released a new library — <a href="https://uber-web.github.io/loaders.gl/#/docs/overview/introduction">loaders.gl</a>, a new companion to deck.gl. Loaders.gl is an open-source collection of framework-agnostic loaders for industry-standard 3D file formats such as OBJ, PLY and glTF. All loaders in this suite output to the same consistent, standardized format.</p><p>The <a href="http://deck.gl/#/documentation/deckgl-api-reference/layers/simple-mesh-layer">SimpleMeshLayer</a> is great for rendering simple 3D models in formats such as OBJ and PLY (<a href="https://github.com/uber/deck.gl/tree/7.0-release/examples/website/mesh">example</a>).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/800/1*AwDoGv5bPCwZabYBXS8UAg.png" /><figcaption><em>deck.gl rendering a fleet of Mini Coopers from OBJ with the SimpleMeshLayer</em></figcaption></figure><p>Most notably, we are adding initial integration with the popular <a href="https://www.khronos.org/gltf/">glTF™</a> asset format. glTF is a royalty-free specification for the efficient transmission and loading of 3D assets, with a rich ecosystem of tools and extensions. 
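The mesh-layer pattern above can be sketched as follows (the helper function, ids, and values here are hypothetical; in an application the parsed mesh would come from a loaders.gl loader such as the OBJ loader):

```javascript
// Hypothetical helper: build SimpleMeshLayer-style props for a parsed mesh.
function makeSimpleMeshLayerProps(mesh, rows) {
  return {
    id: 'mini-coopers',
    mesh,                                   // e.g. the output of a loaders.gl parse
    data: rows,                             // one model instance per row
    getPosition: d => d.position,           // anchor each instance on the map
    getColor: d => d.color || [200, 200, 200],
    getOrientation: d => [0, d.heading || 0, 0],
    sizeScale: 25
  };
}

const meshProps = makeSimpleMeshLayerProps(null /* mesh placeholder */, [
  {position: [-122.45, 37.78], heading: 45}
]);
```

In an app, `meshProps` would be handed to `new SimpleMeshLayer(meshProps)` from `@deck.gl/mesh-layers`.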
All variants of glTF 2.0 are supported, including binary .glb files as well as JSON .gltf files with binary assets in base64 encoding or in separate files.</p><p>A full scene graph can be rendered with the <a href="http://deck.gl/#/documentation/deckgl-api-reference/layers/scenegraph-layer">ScenegraphLayer</a> — a more advanced version of the SimpleMeshLayer — with experimental support for PBR (physically based rendering) materials, lights and animations.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/637/1*3bcB8NDIRn7QYwzToV-X0g.gif" /><figcaption><em>deck.gl showing a horde of animated ducks from glTF with the ScenegraphLayer</em></figcaption></figure><h3>Lighting Effect</h3><p>The effects system has been rewritten from the ground up for v7.0. This will act as a foundation for many exciting visual effect features down the road. The first feature to take advantage of this new system is the LightingEffect — an easier, more comprehensive way to control the lighting for your layers.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/946/0*O7baCFsd-05mrwM6" /><figcaption><em>deck.gl showing lower Manhattan with the GeoJsonLayer, LightingEffect and experimental ShadowEffect (shadow effect is WIP and not released as part of 7.0)</em></figcaption></figure><p>Light sources are now specified at a global level. All 3D layers will share the same light sources. Each layer now supports a new prop `material`, which defines how it reacts to lighting. 
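As an illustrative sketch of this setup (all parameter values below are made up; in an application the lights would be the light classes from `@deck.gl/core`, wrapped in a LightingEffect and passed via Deck’s `effects` prop):

```javascript
// Sketch: global light sources shared by all 3D layers (values illustrative).
const lights = {
  ambientLight: {color: [255, 255, 255], intensity: 1.0},              // AmbientLight
  sunLight: {timestamp: Date.UTC(2019, 6, 1, 22, 0), intensity: 2.0}   // SunLight (UTC time)
};

// A layer opts in to lighting via its new `material` prop:
const material = {
  ambient: 0.35,
  diffuse: 0.6,
  shininess: 32,
  specularColor: [30, 30, 30]
};
```

The point is the split of responsibilities: lights are defined once, globally, while each layer’s `material` decides how its surfaces respond to them.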
The following light sources are supported as of 7.0:</p><ul><li><a href="http://deck.gl/#/documentation/deckgl-api-reference/lights/ambient-light">AmbientLight</a></li><li><a href="http://deck.gl/#/documentation/deckgl-api-reference/lights/point-light">PointLight</a></li><li><a href="http://deck.gl/#/documentation/deckgl-api-reference/lights/directional-light">DirectionalLight</a></li><li><a href="http://deck.gl/#/documentation/deckgl-api-reference/lights/camera-light">CameraLight</a> — a variation of the PointLight, always positioned at the camera.</li><li><a href="http://deck.gl/#/documentation/deckgl-api-reference/lights/sun-light">SunLight</a> — a variation of the DirectionalLight, automatically set based on a UTC time and the current viewport.</li></ul><p>See our <a href="http://deck.gl/#/documentation/developer-guide/using-lighting">developer guide</a> for usage examples.</p><h3>Google Maps Integration</h3><p>deck.gl has added experimental support for using Google Maps as the base map with the <a href="http://deck.gl/#/documentation/submodule-api-reference/deckgl-google-maps/overview">@deck.gl/google-maps</a> module.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*bzFqd8DVBU0lXA5s" /><figcaption><em>Using deck.gl over Google Maps</em></figcaption></figure><h3>Why Upgrade?</h3><p>In addition to the features highlighted above, we have also made dozens of incremental improvements to the existing API, including:</p><ul><li>First-class binary data support</li><li>Support for switching between pixel sizes and meter sizes</li><li>Support for toggling billboard mode in IconLayer and TextLayer</li><li>Projection and interaction bug fixes of OrthographicView and OrbitView for info-vis use cases</li><li>Improved picking performance</li><li>Improved React performance and synchronization with the base map</li><li>Improved dependency management to reduce version conflicts and bundle size</li></ul><p>Make sure to check out the <a 
href="http://deck.gl/#/documentation/overview/upgrade-guide">upgrade guide</a> page for deprecations and breaking changes.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=c18bcb717457" width="1" height="1" alt=""><hr><p><a href="https://medium.com/vis-gl/introducing-deck-gl-v7-0-c18bcb717457">Introducing deck.gl v7.0</a> was originally published in <a href="https://medium.com/vis-gl">vis.gl</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p>]]></content:encoded>
        </item>
    </channel>
</rss>