Google AR and VR

Bring abstract concepts to life with AR expeditions

Over the last three years, Google Expeditions has helped students go on virtual field trips to far-off places like Machu Picchu, the International Space Station and the Galapagos Islands. The more you look around those places in virtual reality (VR), the more you notice all the amazing things that are there. And while we’ve seen firsthand how powerful a tool VR is for going places, we think augmented reality (AR) is the best way to learn more about the things you find there. Imagine walking around a life-sized African elephant in your classroom or putting a museum’s worth of ancient Greek statues on your table.

Last year at Google I/O we announced the Google Expeditions AR Pioneer Program, and over the last school year, one million students have used AR in their classrooms. With AR expeditions, teachers can bring digital 3D objects into their classrooms to help their students learn about everything from biology to Impressionist art.

Starting today, Expeditions AR tours are available to anyone via the Google Expeditions app on both Android and iOS. We’ve also updated the Expeditions app to help you discover new tours, find your saved tours, and more easily start a solo adventure. It’s never been easier to start a tour on your own, at home with your family or in the classroom.

“AR takes the abstract and makes it concrete to the students. We wouldn’t otherwise be able to see a heart right on the desk, what it looks like when beating, and the blood circulating,” says Darin Nakakihara of the Irvine Unified School District.

Google Expeditions makes it easy to guide yourself or an entire classroom through more than 100 AR and 800 VR tours created by Google Arts & Culture partners like the Smithsonian Freer|Sackler, Museo Dolores Olmedo, and Smarthistory, as well as pedagogical partners like Houghton Mifflin Harcourt, Hodder Education a division of Hachette, Oxford University Press and Aquila Education.

Upgrade the Google Expeditions app now to try out AR expeditions with a compatible Android (ARCore) or iOS (ARKit) device. And starting today, interested schools can also purchase the first Expeditions AR/VR kits from Best Buy Education. Like VR, we believe AR can enhance the way we understand the world around us—it’s show-and-tell for a new generation.

Go to Source

Posted by amiller in Arts & Culture, Blog, Google AR and VR
Tour Creator: an easy way for businesses to create and share their own VR tours

Imagine touring a hotel before you book it, seeing the inside of your new office before your first day of orientation, or previewing an apartment before ever stepping foot inside. VR takes you places and lets you experience things that are otherwise too far away, expensive or even impossible to do in the real world. And now Tour Creator, which we launched last week at Google I/O, enables businesses to make their own VR experiences to reach both customers and employees.

Tour Creator was inspired by Google Expeditions, which has brought 3 million students and teachers around the world on virtual reality field trips. We quickly realized that virtual tours could extend far beyond the classroom into the fields of journalism, real estate, professional training and more.

Here are some of the ways businesses have been using Tour Creator through our beta program:

  • Moinian Group, a real estate company, is providing previews of their luxury apartments: “We’re always looking for easy ways for people to explore our offerings. Tour Creator has given us the ability to virtually allow an individual to immerse themselves in our spaces without physically seeing it,” says Michael Mignosi, Director of Marketing.

  • Time Out New York is bringing readers into the thick of the story: “Tour Creator is super useful for journalists because it creates an experience that is a lot more interactive and immersive than video or print by itself would be,” says Delia Barth, Content Producer.

  • Spectrum Designs, a nonprofit organization that employs adults with autism, is using Tour Creator to train new staff. Says Tim Howe, their COO: “An individual with autism can be very sensitive to sights and sounds. With Tour Creator, we can show them exactly what to expect their first day on the job.”

Tour Creator: Create VR Tours

All of these organizations have found it easy to create their own tours. With Tour Creator, you can use your own 360 photos, or choose imagery from Google Street View’s extensive library. Once you’re done, share your tour by publishing it to Poly, Google’s library of 3D content. Anyone can experience it with just a URL: embed the tour on your company’s website, and your users can view it in the browser or, for a fully immersive experience, through Google Cardboard.

To get more info or get started with your first tour, visit the Tour Creator site. We’re looking forward to seeing how VR tours help your business!

Posted by amiller in Blog, Google AR and VR
Now students can create their own VR tours

Editor’s note: For Teacher Appreciation Week, we’re highlighting a few ways Google is supporting teachers—including Tour Creator, which we launched today to help schools create their own VR tours. Follow along on Twitter throughout the week to see more on how we’re celebrating Teacher Appreciation Week.

Since 2015, Google Expeditions has brought more than 3 million students to places like the Burj Khalifa, Antarctica, and Machu Picchu with virtual reality (VR) and augmented reality (AR). Both teachers and students have told us that they’d love to have a way to also share their own experiences in VR. As Jen Zurawski, an educator with Wisconsin’s West De Pere School District, put it: “With Expeditions, our students had access to a wide range of tours outside our geographical area, but we wanted to create tours here in our own community.”  

That’s why we’re introducing Tour Creator, which enables students, teachers, and anyone with a story to tell, to make a VR tour using imagery from Google Street View or their own 360 photos. The tool is designed to let you produce professional-level VR content without a steep learning curve. “The technology gets out of the way and enables students to focus on crafting fantastic visual stories,” explains Charlie Reisinger, a school Technology Director in Pennsylvania.

Once you’ve created your tour, you can publish it to Poly, Google’s library of 3D content. From Poly, it’s easy to view: just open the link in your browser or view it in Google Cardboard. You can also embed it on your school’s website for more people to enjoy. Plus, later this year, we’ll add the ability to import these tours into the Expeditions application.

Tour Creator: Show people your world

Here’s how a school in Lancaster, PA is using Tour Creator to show why they love where they live.

“Being able to work with Tour Creator has been an awesome experience,” said Jennifer Newton, a school media coordinator in Georgia. “It has allowed our students from a small town in Georgia to tell our story to the world.”

To build your first tour, visit the Tour Creator site. Get started by showing us what makes your community special and why you #LoveWhereYouLive!

Posted by amiller in Blog, education, Google AR and VR
Experience augmented reality together with new updates to ARCore

Three months ago, we launched ARCore, Google’s platform for building augmented reality (AR) experiences. There are already hundreds of apps on the Google Play Store that are built on ARCore and help you see the world in a whole new way. For example, with Human Anatomy you can visualize and learn about the intricacies of the nervous system in 3D. Magic Plan lets you create a floor plan for your next remodel just by walking around the house. And Jenga AR lets you stack blocks on your dining room table with no cleanup needed after your tower collapses.


As announced today at Google I/O, we’re rolling out a major update to ARCore to help developers build more collaborative and immersive augmented reality apps.

  • Shared AR experiences: Many things in life are better when you do them with other people. That’s true of AR too, which is why we’re introducing a capability called Cloud Anchors that will enable new types of collaborative AR experiences, like redecorating your home, playing games and painting a community mural—all together with your friends. You’ll be able to do this across Android and iOS.

Just a Line will be updated with Cloud Anchors, and available on Android & iOS in the coming weeks

  • AR all around you: ARCore now features Vertical Plane Detection, which means you can place AR objects on more surfaces, like textured walls. This opens up new experiences like viewing artwork above your mantelpiece before buying it. And thanks to a capability called Augmented Images, you’ll be able to bring images to life just by pointing your phone at them—like seeing what’s inside a box without opening it.
ARCore: Augmented Images
  • Faster AR development: With Sceneform, Java developers can now build immersive, 3D apps without having to learn complicated APIs like OpenGL. They can use it to build AR apps from scratch as well as add AR features to existing ones. And it’s highly optimized for mobile.


The New York Times used Sceneform for faster AR development

Developers can start building with these new capabilities today, and you can try augmented reality apps enabled by ARCore on the Google Play Store.

Posted by amiller in Blog, Google AR and VR
Google Lens: real-time answers to questions about the world around you

There’s so much information available online, but many of the questions we have are about the world right in front of us. That’s why we started working on Google Lens, to put the answers right where the questions are, and let you do more with what you see.

Last year, we introduced Lens in Google Photos and the Assistant. People are already using it to answer all kinds of questions—especially when they’re difficult to describe in a search box, like “what type of dog is that?” or “what’s that building called?”

Today at Google I/O, we announced that Lens will now be available directly in the camera app on supported devices from LGE, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ, Asus, and of course the Google Pixel. We also announced three updates that enable Lens to answer more questions, about more things, more quickly:

First, smart text selection connects the words you see with the answers and actions you need. You can copy and paste text from the real world—like recipes, gift card codes, or Wi-Fi passwords—to your phone. Lens helps you make sense of a page of words by showing you relevant information and photos. Say you’re at a restaurant and see the name of a dish you don’t recognize—Lens will show you a picture to give you a better idea.  This requires not just recognizing shapes of letters, but also the meaning and context behind the words. This is where all our years of language understanding in Search help.

Second, sometimes your question is not, “what is that exact thing?” but instead, “what are things like it?” Now, with style match, if an outfit or home decor item catches your eye, you can open Lens and not only get info on that specific item—like reviews—but also see things in a similar style that fit the look you like.


Third, Lens now works in real time. It’s able to proactively surface information instantly—and anchor it to the things you see. Now you’ll be able to browse the world around you, just by pointing your camera. This is only possible with state-of-the-art machine learning, using both on-device intelligence and cloud TPUs, to identify billions of words, phrases, places, and things in a split second.

Much like voice, we see vision as a fundamental shift in computing and a multi-year journey. We’re excited about the progress we’re making with Google Lens features that will start rolling out over the next few weeks.

Posted by amiller in Blog, Google AR and VR
Introducing the first Daydream standalone VR headset and new ways to capture memories

Back in January, we announced the Lenovo Mirage Solo, the first standalone virtual reality headset that runs Daydream. Alongside it, we unveiled the Lenovo Mirage Camera, the first camera built for VR180. Designed with VR capture and playback in mind, these devices work great separately and together. And both are available for purchase today.

More immersive

The Mirage Solo puts everything you need for mobile VR in a single device. You don’t need a smartphone, PC, or any external sensors—just pick it up, put it on, and you’re in VR in seconds.

The headset was designed with comfort in mind, and it has a wide field of view and an advanced display that’s optimized for VR. It also features WorldSense, a powerful new technology that enables PC-quality positional tracking on a mobile device, without the need for any additional sensors. With it, you can duck, dodge and lean, step backward, forward or side-to-side. All of this makes for a more natural and immersive experience, so you really feel like you’re there.


Lenovo Mirage Solo

With over 350 games, apps and experiences in the Daydream library, there’s tons to see and do. WorldSense unlocks new gameplay elements that bring the virtual world to life, and more than 70 of these titles make use of the technology, including Blade Runner: Revelations, Extreme Whiteout, Narrows, BBC Earth Live in VR, Fire Escape, Eclipse: Edge of Light, Virtual Virtual Reality, Merry Snowballs, and Rez Infinite. So whether you’re a gamer or an explorer, there’s something for everyone.

Point and shoot VR capture

Alongside the Mirage Solo, we worked with Lenovo to develop the first VR180 consumer camera, the Lenovo Mirage Camera. VR180 lets anyone capture immersive VR content with point and shoot simplicity. Photos and videos taken with the camera transport you back to the moment of capture with a 180 degree field of view and crisp, three-dimensional imagery.

There’s no better place to relive your VR180 memories than in the Lenovo Mirage Solo headset. And with support for VR180 built into Google Photos, you can easily share those moments with your friends and family—regardless of what device they have.


Lenovo Mirage Camera

We can’t wait for you to try out the Lenovo Mirage Solo and Mirage Camera to dive into new immersive experiences, and to start capturing your favorite moments in VR.

Posted by amiller in Blog, Google AR and VR
Behind the scenes: Coachella in VR180

Last weekend, fans from all around the world made the trek to Southern California to see some of music’s biggest names perform at Coachella. To make those not at the festival feel like they were there, we headed to the desert with VR180 cameras to capture all the action.

Throughout the first weekend of Coachella, we embarked on one of the largest VR live streams to date, streaming more than 25 performances (with as many cameras to boot) across 20 hours and capturing behind-the-scenes footage of fans and the bands they love. If you missed it live, you can enjoy some of the best experiences—posted here.

VR180 can take you places you never thought possible—the front row at a concert, a faraway travel destination, the finals of your favorite sporting event, or a memorable location. This year at Coachella, we pushed the format even further by adding augmented reality (AR) overlays on top of the performances—like digital confetti that falls when the beat drops, or virtual objects that extend into the crowd.


AR Palm Trees and Windmills in the VR180 stream.

To add these overlays in real time, we used our VR180 cameras together with stitching servers running a custom 3D graphics engine and several positionally tracked cameras. This allowed us to add a layer of spatially relevant visuals to the video feed. Simply put, it’s like AR stickers for VR180.

In addition to the responsive AR elements during performances, we also featured Tilt Brush art by artist-in-residence Cesar Ortega, who drew his live impressions of the iconic Coachella landscape by day, at dusk and at night. We then inserted Cesar’s designs into the VR180 video stream so viewers could see the art.


Cesar Ortega’s recreation of Coachella in Tilt Brush

Watch the festival footage, including performances and behind-the-scenes footage from the point of view of both the fans and the bands, here. And for the most immersive experience, check it out in VR with Daydream View or Cardboard.

Posted by amiller in Blog, Google AR and VR

How to publish VR180

Last year we introduced VR180, a new video format that makes it possible to capture or create engaging immersive videos for your audience. Most VR180 cameras work just like point-and-shoot models. However, what you capture in VR180 is far more immersive. You’re able to create VR photos and videos in stunning 4K resolution with just the click of a button.

Today, we’re publishing the remaining details about creating VR180 videos on GitHub and photos on the Google Developers website, so any developer or manufacturer can start working with VR180.

For VR180 video, we simply extended the Spherical Video Metadata V2 standard. Spherical V2 supports the mesh-based projection needed to allow consumer cameras to output raw fisheye footage. We then created the Camera Motion Metadata Track so that the video can be stabilized according to the camera motion after capture. This results in a more comfortable VR experience for viewers. The photos generated by the cameras are written in the existing VR Photo Format pioneered by Cardboard Camera.
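To illustrate the idea behind the Camera Motion Metadata Track, here is a toy stabilizer in Python. It is a deliberately simplified sketch, not VR180's actual pipeline: the `stabilize` helper is hypothetical, motion is reduced to 1-D horizontal shifts on scanlines, and real VR180 stabilization compensates full 3-D camera rotations in the spherical projection.

```python
import numpy as np

def stabilize(frames, offsets):
    """Undo recorded per-frame camera motion after capture.

    frames  : (N, W) array, one horizontal scanline per frame
    offsets : recorded horizontal camera shift (in pixels) per frame,
              analogous to the motion stored in the metadata track
    """
    out = np.empty_like(frames)
    for i, (frame, off) in enumerate(zip(frames, offsets)):
        # Shift the frame back by the motion recorded at capture time.
        out[i] = np.roll(frame, -off)
    return out
```

Because the motion is stored as metadata rather than baked into the pixels, a player can apply (or skip) the correction at playback time—which is why stabilizing after capture is possible at all.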

When you use a Cardboard or Daydream View to look back on photos and videos captured using VR180, you’ll feel like you’re stepping back into your memory. And you can share the footage with others using Google Photos or YouTube, on your phone or the web. We hope that this makes it simple for anyone to shoot VR content, and watch it too.

In the coming months, we will be publishing tools that help with writing appropriately formatted VR180 photos and videos and playing them back, so stay tuned!

Posted by amiller in Blog, Google AR and VR
Announcing high-quality stitching for Jump

We announced Jump in 2015 to simplify VR video production from capture to playback. High-quality VR cameras make capture easier, and Jump Assembler makes automated stitching quicker, more accessible and affordable for VR creators. Using sophisticated computer vision algorithms and the computing power of Google’s data centers, Jump Assembler creates clean, realistic image stitching resulting in immersive 3D 360 video.

Stitching, then and now

Today, we’re introducing an option in Jump Assembler to use a new, high-quality stitching algorithm based on multi-view stereo. This algorithm produces the same seamless 3D panoramas as our standard algorithm (which will continue to be available), but it leaves fewer artifacts in scenes with complex layers and repeated patterns. It also produces depth maps with much cleaner object boundaries, which is useful for visual effects (VFX) work.

Let’s first take a look at how our standard algorithm works. It’s based on the concept of optical flow, which matches pixels in one image to those in another. When matched, you can tell how pixels “moved” or “flowed” from one image to the next. And once every pixel is matched, you can interpolate the in-between views by shifting the pixels part of the way. This means that you can “fill in the gaps” between the cameras on the rig, so that, when stitched together, the result is a seamless, coherent 360° panorama.

Optical-flow based view interpolation
Left: Image from left camera. Center: Images interpolated between cameras. Right: Image from right camera.
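As a rough sketch of this pixel-shifting step, the following NumPy snippet forward-warps an image a fraction `t` of the way along a precomputed flow field. The function name and setup are illustrative, not Jump Assembler's actual code, and it ignores real-world concerns like sub-pixel splatting and hole filling.

```python
import numpy as np

def interpolate_view(left, flow, t):
    """Approximate an in-between view by shifting pixels along their flow.

    left : (H, W) image from the left camera
    flow : (H, W, 2) per-pixel (dx, dy) vectors matching left -> right
    t    : fraction in [0, 1]; 0 gives the left view, 1 approximates the right
    """
    h, w = left.shape
    out = np.zeros_like(left)
    ys, xs = np.mgrid[0:h, 0:w]
    # Each source pixel lands t of the way along its flow vector.
    xd = np.clip(np.round(xs + t * flow[..., 0]).astype(int), 0, w - 1)
    yd = np.clip(np.round(ys + t * flow[..., 1]).astype(int), 0, h - 1)
    out[yd, xd] = left
    return out
```

A production stitcher blends warps from both cameras and fills the holes left by occluded pixels; this sketch shows only the core “shift pixels part of the way” idea.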

Using depth for better stitches

Our new, high-quality stitching algorithm uses multi-view stereo to render the imagery. The big difference? This approach can find matches in several images at the same time. The standard optical flow algorithm only uses one pair of images at a time, even though other cameras on the rig may also see the same objects.

Instead, the new multi-view stereo algorithm computes the depth of each pixel (i.e., the distance to the object seen at that pixel, yielding a 3D point), and any camera on the rig that sees that 3D point can help to establish its depth, making the matching process more reliable.

Standard quality stitching on the left: Note the artifacts around the right pole. High quality stitching on the right: Artifacts removed by the high-quality algorithm.

Standard quality depth map on the left: Note the blurry edges. High quality depth map on the right: More detail and sharper edges.

The new approach also helps resolve a key challenge for any stitching algorithm: occlusion. That is, handling objects that are visible in one image but not in another. Multi-view stereo stitching is better at dealing with occlusion because if an object is hidden in one image, the algorithm can use an image from any of the surrounding cameras on the rig to determine the correct depth of that point. This helps reduce stitching artifacts and produce depth maps with clean object boundaries.
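A toy version of this depth search, under heavy simplifications (rectified 1-D scanlines, known per-camera baselines, brightness constancy), might look like the following. All names here are illustrative rather than Jump's production code; the point is that every camera that sees the point votes on each candidate depth.

```python
import numpy as np

def best_depth(ref, others, baselines, focal, candidate_depths, x):
    """Pick the candidate depth whose reprojections agree best across cameras.

    ref              : 1-D reference scanline
    others           : scanlines from the other cameras on the rig
    baselines        : distance from each other camera to the reference camera
    focal            : focal length in pixels
    candidate_depths : depth hypotheses to test
    x                : pixel position in the reference scanline
    """
    costs = []
    for depth in candidate_depths:
        cost = 0.0
        for img, b in zip(others, baselines):
            # Reproject: disparity shrinks as depth grows (focal * b / depth).
            xp = int(round(x - focal * b / depth))
            if 0 <= xp < len(img):
                cost += abs(float(ref[x]) - float(img[xp]))
            else:
                cost += 1e9  # point not visible in this camera
        costs.append(cost)
    return candidate_depths[int(np.argmin(costs))]
```

With only one other camera this degenerates to two-view matching; adding cameras makes wrong depths far less likely to score well by accident, which is the robustness gain described above.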

If you’re a VR filmmaker and want to try this new algorithm for yourself, select “high quality” in the stitching quality dropdown in Jump Manager for your next stitch!

Posted by amiller in Blog, Google AR and VR
Chromebook tablets for versatile learning

This past January, students in Kristine Kuwano’s third grade classroom were buzzing with excitement at De Vargas Elementary School in Cupertino, California. Tasked with writing out math equations to upload to Google Classroom, the students grabbed their new tablets from the cart, pulled out the stylus, and logged into Chrome. “They love technology and they have grown up working with touch devices, so tablets are intuitive for them,” said Kuwano.

Since their debut, schools have chosen Chromebooks because they are fast, easy to use and manage, shareable, secure and affordable. We’ve listened carefully to feedback from educators around the world, and one common theme is that they want all the benefits of Chromebooks in a tablet form.

Starting today, with the new Acer Chromebook Tab 10, we’re doing just that. It’s the first education tablet made for Chrome OS, and gives schools the easy management and shareability of Chromebook laptops. With touch and stylus functionality, this lightweight device is perfect for students creating multimedia projects—and also comes with a world of immersive experiences with Google Expeditions AR.

The new Acer Chromebook Tab 10 is easy to pass around the room from student to student.

Shareable, secure, and easy to manage

Whether overseeing 100 or 100,000 devices, IT admins can manage these new Chromebook tablets alongside other Chrome devices with the Chrome Education license. This lets students access everything they need to learn, while giving admins control from a single, scalable console.

Because Chrome OS lets students securely share devices, Chromebook tablets are perfect for computer carts. Just like Chromebook laptops, students can quickly and securely log on to any device for a personalized learning experience and just as easily log out from all apps when class is over. Verified boot checks security at every boot and all user data is encrypted, making each Chromebook tablet secure and shareable.

“What’s awesome is we can manage these new Chromebook tablets like we manage our existing Chromebook laptops—all on one platform. We don’t have to move between different interfaces. I manage my Chromebooks here, my tablets here, all as one big fleet,” says Mark Loundy, Instructional Technology Specialist at De Vargas.

Think outside the desk(top): touch, stylus and Expeditions

These new Chromebook tablets are lightweight and durable, allowing students to collaborate, create and learn from anywhere. They come with a low-cost Chromebook stylus inside that doesn’t require charging or pairing. The stylus uses advanced machine learning to predict student writing for a natural writing experience.

A De Vargas Elementary School student upgrades from a No. 2 pencil to the wireless stylus for the Acer Chromebook Tab 10.

Coming soon, teachers can take students on Google Expeditions to the Great Barrier Reef, the Colosseum, and even to the International Space Station—all from the screens of their Chrome devices. And with Expeditions AR, students will be able to stare into the eye of a miniature Category 5 hurricane or get up close with a strand of DNA.

Apps for every subject

Learning apps come to life in new ways when students have the flexibility of touchscreens, styluses and tablets. Student scientists can collect field notes in Science Journal and aspiring podcast producers can record and edit stories on the go with Soundtrap. Here are a few more apps that educators love to use with tablets: 

  • Get hands-on with handwriting: Students can use their stylus to jot down notes in Google Keep without the hassle of keeping track of (and losing) paper. In Squid, students can write directly on PDFs and on “paper” types like blank, wide-ruled, and grid. With the annotation feature in Google Classroom, teachers can illustrate complex concepts and give visual feedback, as well as assign PDF worksheets that students can annotate by hand.
  • Use your tablet in every class: For educators, creative apps like Adobe Illustrator Draw turn the classroom into a design studio, and let students and teachers draw and create vector designs. Teaching math or science? Apps like Texthelp EquatIO let students show their work by hand writing any math expression and adding it to a Google Doc in one click. Coding apps like Scratch Jr introduce younger students to the foundations of computational thinking, while enabling them to be creators.
  • Bring ideas to life: Amplify storytelling and allow students to animate their thinking on an infinitely interactive and collaborative whiteboard with Explain Everything. Book Creator lets students create and publish multimedia books, and WeVideo turns the classroom into a movie studio with features like collaborative editing and green screen. 

The Acer Chromebook Tab 10 comes with support for these and hundreds of other learning applications from our ever-growing catalog of apps in the Play Store. See a sample of other learning apps on Google Play.

No one knows what’s needed in the classroom more than teachers. As we continue to grow Chromebooks, we encourage educators and parents to try out new devices and apps, and let us know what you think. The Acer Chromebook Tab 10 will be on sale through education resellers this spring—check with your local reseller for more information.

Posted by amiller in Blog, Chromebooks, education, Google AR and VR