Showing posts with label mixed reality.

Monday, September 28, 2020

What's up with this blog? - Still blogging?

You might be wondering why there hasn't been much going on here for a while and whether I stopped blogging. Well, no: I'm still blogging, but I now focus my efforts on the Valorem Reply company blog with similar content.

 Please go here for my latest blog post.

Wednesday, June 19, 2019

Advanced Mixed Reality Development and Best Practices - Content for the MR Dev Summit 2019 Session

Greetings from London, where we are having a good time at the MR Dev Summit 2019, networking and learning with passionate Mixed Reality experts.

I also delivered a session, "Advanced Mixed Reality Development and Best Practices".
The presentation covered my top 10 HoloLens / Mixed Reality development best practices, including Deep Learning, WHY Azure Spatial Anchors are a transformational technology and HOW to develop applications with them.
I also gave a live demo of a spatially persisted sticky notes AR app with native versions for Android ARCore and iOS ARKit. The code is available on GitHub, and I walked attendees through the architecture and the technology under the hood.
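
As a quick illustration (not part of the original talk materials), the cloud-anchor creation flow at the heart of the sticky notes demo looks roughly like the C# sketch below, written against the Azure Spatial Anchors SDK. The class name, the account credentials and the platformAnchorHandle parameter (the native ARKit/ARCore anchor backing a sticky note) are placeholders I made up for illustration, and the sketch assumes the SDK projection where LocalAnchor is an IntPtr; the actual implementation is in the linked repo.

using System;
using System.Threading.Tasks;
using Microsoft.Azure.SpatialAnchors; // Azure Spatial Anchors C#/.NET SDK

public class StickyNotePublisher
{
    private readonly CloudSpatialAnchorSession session = new CloudSpatialAnchorSession();

    public void StartSession(string accountId, string accountKey)
    {
        // Placeholder credentials; a real app should not hard-code the account key.
        session.Configuration.AccountId = accountId;
        session.Configuration.AccountKey = accountKey;
        // On mobile the native ARKit/ARCore session also has to be hooked up to the
        // ASA session (omitted here for brevity).
        session.Start();
    }

    // platformAnchorHandle is the handle of the native ARKit/ARCore anchor a sticky
    // note is attached to (hypothetical parameter for this sketch).
    public async Task<string> PublishNoteAnchorAsync(IntPtr platformAnchorHandle)
    {
        // Wait until enough of the environment has been captured for a stable anchor.
        SessionStatus status = await session.GetSessionStatusAsync();
        while (status.RecommendedForCreateProgress < 1.0f)
        {
            await Task.Delay(250);
            status = await session.GetSessionStatusAsync();
        }

        var cloudAnchor = new CloudSpatialAnchor { LocalAnchor = platformAnchorHandle };
        await session.CreateAnchorAsync(cloudAnchor); // uploads the anchor to Azure

        // The identifier is what gets shared (e.g. via the app's backend) so other
        // devices can find the same sticky note in the real world.
        return cloudAnchor.Identifier;
    }
}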

The slide deck can be downloaded here. It's quite large with embedded videos, so I recommend downloading it.
The demo source code for the Azure Spatial Anchors app is available here on GitHub.

Thursday, May 9, 2019

Developing Mobile AR Applications with Azure Spatial Anchors - Content for the Microsoft Build 2019 Session

Greetings from Seattle where we had a blast delivering our breakout session at Microsoft's largest developer conference, Build 2019.
The title of our talk was "Developing Mobile Augmented Reality (AR) Applications with Azure Spatial Anchors".

This was my 5th time in a row speaking at Build as an external, non-Microsoft speaker. What an honor.
This time I was co-presenting with Paris Morgan from the Azure Spatial Anchors team. An even bigger honor.

The presentation covered WHY Azure Spatial Anchors are a transformational technology, what a Spatial Anchor is, and why mobile AR is very much relevant even when we have advanced head-mounted devices like the HoloLens 2. I also showed lots of examples and use cases for enterprise and consumer scenarios. Then Paris explained HOW to develop these applications yourself. We also had a live demo of a spatially persisted sticky notes AR app with native versions for Android ARCore and iOS ARKit. The code is available on GitHub, and we walked attendees through the architecture and the technology under the hood. Paris also covered UX best practices, including how to best place anchors, and we were able to answer a couple of questions.
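
To complement the anchor-creation sketch in the MR Dev Summit post above, the locating half of the flow (finding a previously shared anchor on another device) roughly follows the watcher pattern below. Again a hedged sketch against the Azure Spatial Anchors C# SDK: the anchorId would come from whatever sharing backend the app uses, and "session" is the same running CloudSpatialAnchorSession as in the creation sketch.

// Locate a previously shared sticky note anchor by its identifier.
public void LocateNoteAnchor(string anchorId)
{
    session.AnchorLocated += (sender, args) =>
    {
        if (args.Status == LocateAnchorStatus.Located)
        {
            CloudSpatialAnchor found = args.Anchor;
            // found.LocalAnchor is the re-created platform anchor; re-attach the
            // sticky note visuals at its pose to restore the original placement.
        }
    };

    var criteria = new AnchorLocateCriteria { Identifiers = new[] { anchorId } };
    session.CreateWatcher(criteria); // starts scanning for the requested anchor(s)
}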

The slide deck can be downloaded here. It's quite large with embedded videos, so I recommend downloading it.
The demo source code is available here on GitHub.
You can watch the session recording here or embedded below.
There seems to have been a glitch with the recording around the 21:50 mark, where the production crew had to cut out a minute of content. :-( I will post the MR Dev Days video once it is out, which hopefully does not have such a recording/processing issue.

Monday, December 3, 2018

Content for the DevTernity Session - Advanced Mixed Reality Development and Best Practices

DevTernity in beautiful Riga, Latvia, was a great conference, and I had good fun delivering my session called "Advanced Mixed Reality Development and Best Practices".

The presentation covered best practices we have learned while developing for the HoloLens since 2015 and my Top 10 HoloLens Developer Recommendations 2018. I also talked about and demoed exciting new things with MR + AI, showcasing near real-time object recognition running on the HoloLens and leveraging WinML for on-device Deep Learning inference. You can see a video of the demo embedded at the bottom of this post. I also showed how the new RS5 Windows 10 update brings hardware-accelerated inference to the HoloLens via DirectX 12 drivers.
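
To give a rough idea of what the WinML part looks like in code, here is a minimal, hedged sketch of on-device inference with the Windows.AI.MachineLearning API in a UWP app. The class name, the model file name and the "data"/"loss" tensor names are assumptions that depend on the exported ONNX model, and the VideoFrame would come from the HoloLens camera (e.g. via MediaCapture); the actual demo code is in the linked repo.

using System;
using System.Linq;
using System.Threading.Tasks;
using Windows.AI.MachineLearning;   // WinML, available with Windows 10 RS5 / SDK 17763
using Windows.Media;
using Windows.Storage;

public class OnDeviceObjectRecognizer
{
    private LearningModelSession session;

    public async Task LoadModelAsync()
    {
        // Load the ONNX model packaged with the app (file name is an assumption).
        StorageFile modelFile = await StorageFile.GetFileFromApplicationUriAsync(
            new Uri("ms-appx:///Assets/ObjectRecognition.onnx"));
        LearningModel model = await LearningModel.LoadFromStorageFileAsync(modelFile);

        // RS5 brings DirectX 12 drivers to the HoloLens, so the session can run
        // hardware-accelerated on the GPU instead of falling back to the CPU.
        var device = new LearningModelDevice(LearningModelDeviceKind.DirectXHighPerformance);
        session = new LearningModelSession(model, device);
    }

    // videoFrame would be a camera frame, e.g. grabbed via MediaCapture.
    public async Task<float[]> RecognizeAsync(VideoFrame videoFrame)
    {
        var binding = new LearningModelBinding(session);

        // "data" and "loss" are typical tensor names for Custom Vision ONNX exports;
        // adjust them to whatever the actual model uses.
        binding.Bind("data", ImageFeatureValue.CreateFromVideoFrame(videoFrame));
        LearningModelEvaluationResult result = await session.EvaluateAsync(binding, "frame");

        var scores = result.Outputs["loss"] as TensorFloat;
        return scores?.GetAsVectorView().ToArray();
    }
}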

The slide deck can be downloaded here. It's quite large with embedded videos, so I recommend downloading it.
The WinML object recognition demo source code targeting RS5 APIs is here.
The session was recorded. You can watch it here or embedded below:



Also, here's a video recording of my WinML object recognition demo targeting industrial scenarios:

Monday, November 19, 2018

The Sky is NOT the limit - Content for the European Space Agency ESA Keynote and Talks

The European Space Agency ESA invited me a while ago to give a keynote for the Visualisation and Science Communication track at the first ever Φ-week conference. I was also the chair of the track, making sure speakers were on time, managing questions, etc. In addition, I gave another session at the VR/AR workshop and helped with the closing panel discussion, which offered advice on how the ESA could leverage VR/AR in an earth observation context.

I felt really honored that the ESA invited me to keynote their first event, and I had a great time at the ESA ESRIN facility in beautiful Frascati near Rome, Italy.
ESRIN is planning another Φ-week conference for 2019, so keep an eye out in case you want to go as well.

My keynote "Beam me up, Scotty! Teleporting people and objects via 3D holographic livestreaming." showed how Immersive Telepresence and holographic 3D communication will play an important role for space travel and colonization. Of coursed it included our HoloBeam technology and how it is making science-fiction a reality, enabling those use cases. I also showed some AI Deep Learning research we are running for various, related scenarios and gave an outlook for future holographic projection research.
My other talk was titled "Look, holograms! A short introduction to Mixed Reality and HoloLens." and was exactly that. I also performed live demos of some of our HoloLens apps.

The ESA ESRIN also had two sketch artists on-site who created a funny little summary of the Visualisation and Science Communication track that I chaired and gave the keynote for.

The slide decks for my keynote and the other talk can be downloaded here. Both are quite large with embedded videos, so I recommend downloading them.
The keynote was live streamed and recorded. You can watch it here or embedded below (the stream starts 5 minutes into the video):

Friday, September 28, 2018

Content for the Digility Session - Advanced Mixed Reality Development and Best Practices

I had a good time at the Digility conference in Cologne, delivering my session "Advanced Mixed Reality Development and Best Practices".

The presentation covered best practices we have learned while developing for the HoloLens since 2015 and my Top 10 HoloLens Developer Recommendations 2018. I also talked about and demoed exciting new things with MR + AI, showcasing near real-time object recognition running on the HoloLens and leveraging WinML for on-device Deep Learning inference. You can see a video of the demo embedded at the bottom of this post. I also showed how the new RS5 Windows 10 update brings hardware-accelerated inference to the HoloLens via DirectX 12 drivers.

The slide deck can be downloaded here. It's quite large with embedded videos, so I recommend downloading it.
The WinML object recognition demo source code, updated to the RS5 APIs, is here.
The session was recorded and the video is available here and embedded below:

Also, here's a video recording of my WinML object recognition demo targeting industrial scenarios, including an Easter egg:

Friday, June 22, 2018

Content for the Unite Berlin AutoTech Summit Session - Industrial Mixed Reality

The Unite Berlin AutoTech Summit was amazing and I had a great time delivering my session at Unity's largest conference in Europe.

The title of my talk was "Industrial Mixed Reality – Lessons learned developing for Microsoft HoloLens".

The presentation covered best practices we have learned while developing for the HoloLens since 2015 and my Top 10 HoloLens Developer Recommendations 2018. I also talked about and demoed exciting new things to come with MR + AI, showcasing near real-time object recognition running on the HoloLens and leveraging WinML for on-device Deep Learning inference. You can see a video of the demo embedded at the bottom of this post. This is a brand-new demo video, tailored more to an industrial and automotive context, recognizing car parts and tools.

The slide deck can be downloaded here. It's quite large with embedded videos, so I recommend downloading it.
The WinML object recognition demo source code is here.
The session was recorded and the video is up on Unity's YouTube channel and embedded below.

Also, here's a video recording of my new WinML object recognition demo targeting industrial scenarios, including an Easter egg:

Wednesday, May 9, 2018

Content for the Microsoft Build 2018 Session: Seizing the Mixed Reality Revolution – A past, present and future Mixed Reality Partner perspective

Build 2018 is a wrap! I had a blast delivering my breakout session at Microsoft's largest developer conference in Seattle.

The title of my talk was "Seizing the Mixed Reality Revolution – A past, present and future Mixed Reality Partner perspective".
I was told I was the only external, non-Microsoft speaker with a full 45-minute breakout session. What an honor.

The presentation covered best practices we have learned while developing for the HoloLens since 2015 and my Top 10 HoloLens Developer Recommendations 2018. I also talked about and demoed exciting new things to come with MR + AI, showcasing near real-time object recognition running on the HoloLens and leveraging WinML for on-device Deep Learning inference. You can see a video of the demo embedded at the bottom of this post.

A funny anecdote: an attendee was wearing a HoloLens during my talk. I saw the white LED was turned on, so I assumed he was recording my session with the HoloLens. I asked him afterwards what he was doing, and he was in fact live streaming my session to a colleague on the East Coast using the new Remote Assist app.

The slide deck can be downloaded here. It's quite large with embedded videos, so I recommend downloading it.
The WinML object recognition demo source code is here.
The session was recorded and the video is up on YouTube and embedded right below:

After the session I was interviewed by Lucas and we talked about why fast Deep Learning inference with Windows Machine Learning is a key technology. Watch it here:

Finally, here's a video recording of my WinML object recognition demo:

Tuesday, April 10, 2018

Microsoft Regional Director

Today I got an exciting email from Microsoft inviting me into the Microsoft Regional Director program. Needless to say, I accepted the invitation. It's a huge honor to join this small group of outstanding experts.

I would like to thank Microsoft, the community and especially all that nominated me for RD. 

It was almost exactly 8 years ago that I got my first MVP award, and that blog post started with the same words (I only fixed the exiting/exciting bug :-). Back then I received the Silverlight MVP Award, and shortly after I moved over to the Windows Phone Development MVPs since I was doing so much Windows Phone development. The Windows Phone Dev MVPs joined forces with the Client App Dev and Emerging Experiences MVPs a while ago to form the Windows Development MVP category.

Now, I'm honored to be a Microsoft Windows Development MVP and also a Regional Director. 

If you don't know what a Regional Director is, the RD website describes the role pretty well:

The Regional Director Program provides Microsoft leaders with the customer insights and real-world voices it needs to continue empowering developers and IT professionals with the world's most innovative and impactful tools, services, and solutions.

Established in 1993, the program consists of 150 of the world's top technology visionaries chosen specifically for their proven cross-platform expertise, community leadership, and commitment to business results. You will typically find Regional Directors keynoting at top industry events, leading community groups and local initiatives, running technology-focused companies, or consulting on and implementing the latest breakthrough within a multinational corporation. 

There's also a nice FAQ with more information.

Wednesday, January 10, 2018

Big in Vegas! - HoloBeam at CES

Last year was exciting for our Immersive Experiences team: we reached some nice milestones and got good coverage with our proprietary 3D telepresence technology HoloBeam.

I wrote a post for the Valorem blog with more details about the new version of HoloBeam we are showing at Microsoft's Experience Center (IoT Showcase) at CES 2018 in Las Vegas.
You can read it here:
HOLOBEAM CONTINUES TO PAVE THE WAY FOR IMMERSIVE TELEPRESENCE

Friday, November 10, 2017

Øredev is over - Content for Advanced Mixed Reality Development Talk

Just a few hours ago I finished my presentation at Øredev in Malmö. I really enjoyed the conference and the city, with nice people, great vibes and very good talks.
The title of my talk was "HoloLens 301: Advanced Mixed Reality Development and Best Practices", and it covered advanced Mixed Reality / HoloLens development topics, including a live demo, best practices and lessons we have learned developing for the HoloLens since 2015. The room was full and I received positive feedback.

The slide deck can be viewed and downloaded here, but the main content is in the presentation itself. The presentation was recorded and the video is here and embedded below.
The source code of the demo mentioned is here.

HoloLens 301: Advanced Mixed Reality Development and Best Practices

This was my last conference talk of 2017. See you in 2018 at another conference with new content! 

Tuesday, October 24, 2017

AWE - Recap of Augmented World Expo Europe 2017

Last week I had the pleasure of giving a talk at AWE Europe and was also able to attend it with nearly 1,500 other attendees, 100+ speakers and 90+ exhibitors.
AWE was a great conference and allowed me to connect with old and new contacts, but also to experience lots of new hardware and software and to attend some very good sessions.
This blog post gives a little recap of the event and the things I experienced, using photos I've taken. It starts with the keynotes, then highlights a few great sessions, and finally covers a collection of new devices and experiences I saw at the expo. Make sure to read to the end and check out the great summary on the Valorem blog.
The content for my talk can be found here.

Sessions

AWE was kicked off with a keynote by Ori Inbar, who was sporting a jacket with illuminated AWE letters. Definitely a fun way to start a conference!
All sessions were recorded and will be posted on the AWE YouTube channel.

A couple of gold and silver AWE sponsors were also invited on stage in a "press conference" session to announce their new products. 

emteq's new facial capturing glasses were particularly interesting:
Emteq’s unique OCOsense™ smartglasses contain many tiny sensors in and around the glasses frame and use AI/Machine Learning to read and interpret the wearer’s facial movements and expressions, without cameras, wires or restrictive headgear. OCOsense™ will be available in 2018 and will combine both real-time facial expression tracking with a cloud-based analytics engine, facilitating emotional response analysis for scientific and market research.

After the keynotes and press conference the different tracks started.
Ed from Scape kicked off the Creators/Developers track talking about the city-scale location tech he and his team are developing in London. He didn't share a lot of details as they are in semi-stealth mode, but the topic reminded me of the AR Cloud and how important large-scale location AR is.

Shortly after Ed, it was my turn with my take on Massive Mixed Reality.

Harald Wuest from VisionLib (a spin-off from Fraunhofer IGD) talked about their CAD-model-based tracking, which is similar to Vuforia's Model Targets.
Kudos to Harald for squeezing a live demo into the very short presentation time.

I was also able to try it out myself at VisionLib's booth and was impressed with the quality and how well it worked even under varying lighting conditions. At the booth I also got a paper test-target model, which I have already folded, and I'm eager to run some tests with their HoloLens API.

Alessandro Terenzi from AR-media presented a great overview of the history of AR tracking, from markers to current state-of-the-art object tracking with feature maps and model-based approaches.

pmd talked about their Time-of-Flight depth sensor, which is used in the Meta 2, some Google Tango devices and others. It can also be acquired as a standalone sensor.
The ambient-light invariance was really impressive, allowing the sensor to work outdoors as well.

A gentleman from Wayfair talked about their approach to bringing AR to retail with WebAR, currently leveraging experimental Chromium releases.

One of my favorite talks was given by Khaled Sarayeddine from optinvent. He broke down the optics used in different state-of-the-art AR see-through devices. I have never heard such a comprehensive and detailed overview of AR see-through tech.

Khaled Sarayeddine is one of the few internationally recognized optics experts and overall a very nice guy. I was able to catch up with him after his talk and he shared more interesting industry insights.

AWE Europe was held in Munich in southern Germany, which is home to lots of (automotive) industry, so there were also informative sessions covering the digital transformation of the industry and how it relates to AR/VR.

Adam Somlai-Fischer from prezi showed a preview of their exciting upcoming AR features.
A pretty cool feature, and maybe we will be giving presentations via AR in the future.

I also attended a few interesting panels covering the VR game market, the art of storytelling and how VR/AR is being used as a new medium by artists.

Gartner presented their Top 5 Predictions about Digital Transformation, which were insightful as usual. Gartner estimates that by 2020 more than 100 million consumers will shop in AR, and that mobile AR (which I call Classic AR) will be dominant at least until the end of 2018.

Expo 

The exhibition area at AWE was packed with lots of interesting booths. In fact, there was so much amazing tech that the two days were not even enough to explore everything.

Smart Glasses

Classic AR will drive mass adoption for a while, but there's little doubt that HMDs are the real Future AR. I was able to try a few new smart glasses that are available on the market.
Disclaimer: this post is not a review, just a subjective write-up of my experience. Keep in mind that every device is on the market for a reason and fulfills a certain demand.

Smart Glasses without Tracking
These devices show a screen in front of one or both eyes but don't track the environment, so the overlays are not spatially integrated. The form factor is very lightweight.
Some of these devices use different light-guide optics (Optinvent, Epson, Google Glass) or just a simple, small display like the Vuzix, which on the other hand is highly modular and configurable.

The prototype from Fraunhofer FEP in Dresden uses beam splitting similar to the Meta 2's display tech. The Fraunhofer prototype is super low-power, and the screens simultaneously act as cameras performing eye tracking, so I was able to scroll the UI with my eye movements.

Smart Glasses with Tracking
The ODG R-9 provides a wide field of view, and the new DAQRI Smart Glasses also had a decent FoV. The DAQRI Smart Glasses have a nice form factor, running the computing unit in an external box that can be clipped to the belt. The rendering quality was great with the DAQRI as well.
Both devices' inside-out tracking capabilities still lag quite a bit behind the HoloLens, which is no surprise considering the HoloLens has a dedicated HPU and goes through intense calibration.

Misc Artists 

More impressions from AWE and some interesting things I spotted, like a few very immersive VR games and prototype shoes for walking in VR.

The artist Sutu performing live painting with the HTC Vive and TiltBrush.

A full-blown VR welding simulator using AprilTags to track real welding equipment. The helmet has a light array and two cameras outside and a screen inside showing the Augmented Virtuality experience. The metallic material simulation and the real hardware integration were very impressive.

The German startup Holo-Light showed a prototype of a pen they are developing called Holo-Stylus, which is supposed to provide 3D pen input. The device leverages optical infrared tracking using an IR camera bar mounted on top of the HoloLens. The prototype used Wi-Fi to connect to the HoloLens, but the final product is supposed to work via Bluetooth.
It's an interesting concept and I'm curious to try it out once it's more mature and works flawlessly.

InsiderNavigation showcased their software solution for indoor navigation using SLAM-based point cloud mapping and AR overlays. 

Last but not least, I was pleasantly surprised that Vuforia showcased our very own Tire Explorer HoloLens app at their booth as an example of Model Targets.