Tag Archive : Augmented Reality


See the Future and Change It with Perceptive Technologies

Your success depends on getting the best performance, reliability and efficiency from your equipment. You’re often being asked to do more with less, and, above all, to do it safely. Today’s technologies are helping to make those things possible. When combined with a team that has years of experience in the mechanics of industrial equipment, troubleshooting, diagnostics and monitoring, facilities are able to maximize output and profitability. As an example, this quarry was spending a lot of time manually monitoring the dewatering system to prevent pump failures and flooding from intermittently rising flood waters. As part of the risk mitigation planning, Perceptive Technologies deployed a cost-effective solution to address the problem, continuously monitoring and analyzing pump performance, motor health, supply power, phase condition, water levels, and connectivity. Monitoring 24 hours a day, 7 days a week helps keep an eye on production and makes you immediately aware of any issues. We chose Regal. Regal is very innovative. They have the engineering backing to build the product that we were after. We’re gaining production due to less downtime when we do have issues with pump power, supply power or water levels. We are interested in installing this pump monitoring system at other pumping locations. Perceptive Technologies offers advanced services
for machinery diagnostics to solve difficult challenges. In this rotary kiln application, the customer recently experienced two serious failures of the motor shaft after the previously installed fluid drive was replaced with a VFD for better speed control. We contacted Regal and, upon hearing the situation, they said they could help us by putting a strain gauge on the shaft to see what kind of torsional vibration we were having. With Regal’s help, we determined that we needed another coupling, or something else, to dampen this torsional vibration. If we had continued the way we were going, we were going to break another shaft. To solve the problem, Perceptive Technologies performed a torsional vibration analysis, which allowed for mechanical tuning of the drivetrain using a Kop-Flex® Max-C coupling to avoid the harmful resonance conditions that were failing the motor shafts. Changing the coupling dampened the vibration, which allowed us to run the fan on a VFD without any problems. The one thing that set Regal apart is that they could break down what was happening into layman’s terms. They could explain it to the plant personnel and make them feel comfortable with what they were doing and why it would make things better. When it’s that easy to talk to someone, it builds your confidence in the company they represent. Complemented by Regal’s extensive product
portfolio, we provide more than data. As a manufacturer of industrial power transmission
components, we also use these technologies to keep on top of our production assets. For this metal forming application, we utilize
continuous online monitoring of several key parameters so that we can perform maintenance
only when the condition warrants it. That eliminates unnecessary shutdowns and
minimizes safety risks. Our Perceptive Technologies products and services
have been successfully implemented in a variety of industries. And although the applications may be different
and each customer is unique, our solutions are flexible and adaptable to each situation. We understand your company expects more from
you. We can help you get more out of your equipment. Let us know how we can help your facility
with solutions that best fit your needs, and ultimately, do more with less.
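The round-the-clock monitoring described above boils down to comparing each telemetry reading against safe operating limits and raising an alert on any violation. A minimal sketch in Python; the parameter names and threshold values below are illustrative assumptions, not the actual product logic:

```python
# Hypothetical sketch of threshold-based monitoring for a quarry
# dewatering pump; all limits here are illustrative, not vendor values.
from dataclasses import dataclass

@dataclass
class PumpReading:
    motor_temp_c: float      # motor winding temperature, Celsius
    supply_voltage_v: float  # incoming supply voltage
    phase_imbalance_pct: float
    water_level_m: float     # sump water level, meters

def check_reading(r: PumpReading) -> list[str]:
    """Return a list of alert strings for any out-of-range parameter."""
    alerts = []
    if r.motor_temp_c > 90.0:
        alerts.append(f"motor overheating: {r.motor_temp_c:.1f} C")
    if not (380.0 <= r.supply_voltage_v <= 440.0):
        alerts.append(f"supply voltage out of range: {r.supply_voltage_v:.0f} V")
    if r.phase_imbalance_pct > 2.0:
        alerts.append(f"phase imbalance: {r.phase_imbalance_pct:.1f}%")
    if r.water_level_m > 3.5:
        alerts.append(f"high water level: {r.water_level_m:.2f} m")
    return alerts
```

A real deployment would stream readings continuously and layer trending and motor-health analytics on top of simple threshold checks like these.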

Future of Fact | What Happens Next | Retro Report

I’m a captive audience in this immersive world. You’ve got my attention. Don’t squander it. Welcome. Suit-up sequence initiated. Biometric signature required. Virtual and augmented reality platforms are sweeping the world of gaming. There are games that go deep, and create richly immersive virtual worlds. And there are games that go wide, augmenting the real world and making the game appear anywhere you go. In 2016 alone, about 2.3 billion dollars in investment poured into companies working on virtual and augmented reality platforms. For some, it’s a bet that the technology is poised to cross the threshold from toy to tool. Turning the world into a screen may sound like a dystopia to some. “They called our generation the missing millions.” But backers of the technology say there could be something irresistible in a device that displays facts you need to know, right when you need to know them, right before your eyes. So many things that we now have anxiety around, about our interaction with the physical world, will just stop being an issue. You’re at a water park, and you don’t know where your child is. With a wearable device, you can see them, through buildings, through anywhere, they’re over here, they’re over there. And what about more complex facts? Some journalists say immersive storytelling could be a powerful way to capture the attention of news audiences, so they don’t just think about events, but experience them. How am I going to stay relevant, and get you to care about stories that are on the other side of the world, that you’ve started to block out? I look at these immersive storytelling platforms as the next step in the evolution of journalism. One of the most straightforward methods simply allows viewers to look around. “On a rooftop above Falluja, an Iraqi sniper takes careful aim at an ISIS soldier.” When the New York Times’ Ben Solomon rode along with Iraqi special forces liberating Falluja in 2016, he took a 360-degree camera.
The result was a mix of war reporting that you’d see on TV, alongside moments of something unmistakably new. These are the cells where ISIS would hold their prisoners. As the door shuts, the freedom to direct your own gaze emphasizes the confinement of the space in a way that a fixed perspective just can’t. The fact, here, is something you feel. So, as a journalist, you’re always trying to give people what Martha Gellhorn called “the view from the ground.” But with virtual reality, now, I’m asking my audience to do something much more intense. I’m asking them to be on-scene, to take me out of the picture, and become the witness themselves. Nonny de la Peña and her collaborators are going further than 360 video, creating densely-researched projects that let viewers explore a street bombed in Syria, or fly above a melting glacier in Greenland. We use Google Maps. We use photographs. We use video. We’re very thorough, and very careful, to use the real source material to inform what we build. All that material helps recreate experiences like this one, with Frontline, which takes viewers inside a solitary confinement cell, alongside a man who spent years living in one. “So, you know, I would take blood and I would write messages all over my cell, you know, help me.” The screen disappears. You’re no longer separated from the material that you’re looking at. You’re inside the story. And that gives you this incredible feeling of presence. A feeling like your whole body’s on-scene, and you’re witnessing an event as it really unfolds around you. “I cut myself thousands of times. Just over, and over, and over, and over.” This immersive project, created by conflict photographer Karim Ben Khelifa, is even more interactive. It draws on interviews with fighters on opposite sides of conflicts in Israel and Palestine, El Salvador, and the Democratic Republic of Congo. Those who attacked us killed my mother and my father right before my eyes.
The project, called “The Enemy,” transforms the source material into digital avatars of each man. They can appear in an entirely virtual space, or inside your living room, with the aid of a phone. It doesn’t feel seamlessly real, but it’s still gripping to share their space and listen to them, as their eyes follow you around the room. When you listen to those fighters, realize what they’ve been through, how much hope they still have, how much humanity they still carry. When people go in, they know those fighters exist. They even have a memory of having met those fighters once they come out from there. And I can see that some people are very emotional when they leave this, this experience. But some critics worry that more emotion may be the last thing we need in news. Today, faith in the press has eroded, at the same time new platforms enable new ways to push our buttons, with fake news stories, videos that can make anyone say anything, and outlets that whip up feelings to reinforce particular points of view. If a fake Facebook post can trick you now, imagine what an immersive storytelling piece could do to you, right? The moment at which our bodies and our minds really believe we are someplace else, that is an experience that really threatens deliberation and judgment. One bad scenario is that everything becomes even more about confirmation bias, and people will just completely disconnect from anybody who doesn’t already agree with them. If we are using media to create multiple and competing social realities, then we are imperiling the future of what counts as a fact. Whether you’re talking about screens that replace reality entirely, or ones that simply augment what you’re seeing everywhere, what feels new here is the degree to which this form of communication could short-circuit judgment by getting right in your face. Yet there are precedents for this. There are certain things that have a particular ability to capture our attention. 
And, you know, today that’s the screen, but the first iteration was the poster. The poster, when it came out, was a sensation. Especially in the late 19th century, in France, when they started using bright colors, moving images, sexy women. People were astonished, they couldn’t believe this thing. They said, you know, it controls the mind, it’s out of control. There are still pretty stringent laws, even in New York City, as to where you can have posters. That’s why there aren’t that many eye-level posters in New York. You may not realize that, they’re usually up. It’s because they banned posters being right in your face at all times. People worry that, if we’re giving people an embodied experience, that it’s a subjective experience, and it can’t have the transparency and authenticity that journalism has had before. VR will be used for propaganda. It will be used badly for journalism. It’ll be used for incredible films. But that’s always going to be about, who’s the maker? And it’s not about the medium. And that could mean that the best defense we have against manipulated facts on immersive platforms may well be the same non-technological defenses we’ve been using all along. When misinformation comes, journalists need to be there, to be like, “Okay. You know me. You know me. You can trust me. You know my track record. You know my credibility. You know my history. You know my values and what I stand for. That is fake. That is real.” Every information technology comes with this twin possibility of greater education and information, but greater capacity to manipulate and deceive. The technology it takes to seize people’s attention, and make them pay attention to facts, has always been a double-edged sword. And it seems every era’s next layer of innovation ends up making that sword a little bit bigger. So, you know, once upon a time, the persuaders were, basically, out on the street, in posters, maybe the town crier.
Then it slowly moved into the house, that was radio and the television. Then it came closer to us, with the phone and the computer screen. Now, the future is one where maybe it’s all over your body, even close to your eyeballs, maybe plugged into your brain, you know? So it’s getting closer and closer and closer to us. So where is the future of fact in this medium? We’re just barely getting a glimpse of what it’s going to look like.

OMG this actually HAPPENED!

November 29, 2019 | Articles, Blog | 14 Comments


What is this? Don’t tell me this is what I think it is. I didn’t even give them my name. I have to open this right now! [Inspirational music plays] Oh my god, why has no one shoveled this yet? Yep! My feet are soaking wet right now. Oh my god! We did it. We did it! Sneakers were a bad choice. Oh, yeah. That’s bright! Red? Blue or red? Blue! Yes! Oh yeah. Looks like they wrote me a little note. “Hi SkyGuytheJedi, we hope you enjoy this gift. Thank you for your great contest submission. We’re glad you enjoy our game, and we’re glad to have you as a member of the Star Wars Jedi Challenges community. From Lenovo. PS: We are currently working on the next version of the game and would like you to come down to our super-secret Jedi facilities to test out the new lightsaber battles, since you are clearly the best player / Jedi in the world, and also if you do this we will give you lots of money and tell Disney to put you in the next Star Wars film.” I can’t believe they just said that. I mean, they definitely said it. There’s no reason for me to lie about it. Oh, I think I’m in shock. Okay. So, a little bit of backstory here. A few weeks ago, Lenovo asked the players in its forums to create and submit a video explaining why they think Jedi Challenges is awesome, and whoever’s video they liked the best would get a mystery prize. This is it! Check it out. This is the Starry Night version of the Millennium Falcon. Oh, and I think there’s more. Yeah, got some X-wings in there. Tie… I believe that’s a TIE Interceptor, and X-wings targeting TIE Fighters. Check these out. You know, a little fun fact: these were actually painted by a very famous artist named Vincent van Gogh, in the year 1889. That was like a thousand years ago. Can you even, like, comprehend… like, isn’t it amazing how ahead of his time this guy was? It’s insane. So I have to frame those immediately, and I know exactly where I’m gonna put them. You see that empty wall of real estate right there? And that silly family portrait?
Yeah, that’s all coming down. These are going up. Right there. Mm-hmm. Thank you, Lenovo, for these wonderful prints. They’re amazing, and they’re really going to spice up my backdrop over there for my videos in the future. I was literally running out of Star Wars stuff to put up there, so I really needed these. And to everyone else watching this video: thanks for watching. Tell me what you think about the game. What do you think about the prints? What do you think about Vincent van Gogh? Have you ever heard of him before this? He’s quite the painter, am I right? And before I go, let me know what you think about that magic fireplace. That was a little delayed but… still… pretty… cool? I don’t know. Alright, guys, that is all for today. I hope you enjoyed this video. If you did, drop a “Like”, and if you want to see more, I post new videos every Friday, and if I have time, every Saturday. So “subscribe” if you want to keep up-to-date on the latest news, tips, and gameplay footage for Star Wars Jedi Challenges. And to all my fellow Jedi Masters, remember: [Yoda]: “Pass on what you have learned. That is the true burden of all masters.”

ChromaBlur: Rendering chromatic eye aberration improves accommodation and realism

Traditionally, rendering has focused on “photorealism”: simulating images from cameras with minimal aberration. But rendering for virtual reality should emphasize “perceptual realism” to enable immersion in the virtual environment. To this end, we shouldn’t neglect the imperfect optics of the human eye. We introduce a rendering method that incorporates natural aberrations and thereby produces retinal images that are much closer to what people normally experience. We first calculate the retinal image that would be produced by a 3D scene given an appropriate eye model. Our model incorporates two universal optical effects: defocus and chromatic aberration. The calculated image is the target retinal image. We then solve an inverse problem to determine what image to put on a display screen that, when viewed by an eye, will produce the same image on the retina as the target. We call this algorithm ChromaBlur. Here is the target retinal image for a 3D scene and its depth map for a horizontal cross-section. Focus distance is just over 2 diopters. With Conventional rendering, this is the displayed image associated with that scene and focus distance. With ChromaBlur, this is the displayed image. The right panels show the differences for red and blue between the target retinal image and the retinal images produced by Conventional and ChromaBlur rendering. Gray represents no difference. As focus distance is changed, displayed images are updated accordingly. ChromaBlur produces much more accurate results than conventional rendering. We investigated whether ChromaBlur rendering drives accommodation, that is, whether it guides the eye’s focus. Stimuli were projected onto a screen and viewed by one of the subject’s eyes. A focus-adjustable lens and aperture were just in front of that eye. Accommodation of the other eye was measured with an autorefractor. There were three conditions.
In the Real Change condition, optical distance was changed by manipulating the power of the adjustable lens. In the Conventional condition, simulated distance was changed by altering blur with conventional rendering. In the ChromaBlur condition, simulated distance was again changed in rendering, but now with chromatic aberration rendered correctly. The power of the adjustable lens was not changed during Conventional and ChromaBlur trials. Here’s an example trial and response. The stimulus is initially at 0 diopters, and accommodation is accordingly at 0. The stimulus then jumps to 1.4 diopters. A third of a second later, the eye accommodates to roughly that distance, a very typical response. Here are the results in all three conditions. The first panel shows focusing responses to Real Changes in optical distance. They are accurate and consistent. The second panel shows responses to changes in defocus only (that is, to conventional rendering). The eye does not accommodate, which means that conventional rendering does not drive focusing responses. The third panel shows responses to changes in defocus plus chromatic aberration (that is, ChromaBlur rendering). Remarkably, the eye accommodates much like it does to real changes. ChromaBlur drives focusing responses quite effectively! We showed in another experiment that the response to ChromaBlur persists for stimulus durations of several seconds. We also showed that ChromaBlur continues to drive accommodation effectively when the display resolution is equal to or worse than that of current HMDs. We also investigated whether ChromaBlur increases the impression of real depth. Images of complex 3D scenes were viewed by the left eye. They were rendered Conventionally, with ChromaBlur, or with Reverse ChromaBlur (which is a reversal of natural chromatic aberration). Subjects saw two stimuli on each trial and indicated which yielded a greater depth impression. ChromaBlur stimuli yielded consistently greater impressions of depth than Conventional.
ChromaBlur stimuli also yielded consistently greater depth impressions than Reverse ChromaBlur. Reverse ChromaBlur and Conventional yielded roughly equivalent impressions. Thus, ChromaBlur rendering enhances the impression of depth. ChromaBlur opens an opportunity for next-generation displays. One can couple ChromaBlur rendering with focus-adjustable lenses and eye tracking. When the viewer fixates a new distance in the virtual scene, the tracker senses it. This triggers rendering for the new focus distance. By using ChromaBlur, we ensure that accommodation will occur quickly and in the right direction. The fixation change also triggers adjustment of the lenses in front of the eyes so that the lens inside the eye will adjust just as it would in the natural environment. Recreating the natural relationship between accommodation and blur would also restore the natural relationship between accommodation and vergence, thereby eliminating the vergence-accommodation conflict and the various issues associated with that conflict. Our ChromaBlur rendering technique creates displayed images that, when viewed by the human eye, create more realistic retinal images. This allows greater perceptual realism, which, in turn, should enable more comfortable, immersive, and engaging experiences in virtual environments.
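The forward model described above (per-channel defocus that depends on wavelength) can be sketched in a few lines. This is a toy illustration, not the authors’ implementation: the chromatic focus shifts, pupil size, and blur scaling below are stand-in values, and the real method uses a proper eye model rather than simple Gaussian blur.

```python
# Toy sketch of a chromatic-defocus forward model: each color channel is
# defocused by a different amount because blue focuses in front of red.
# All constants are illustrative stand-ins, not calibrated eye parameters.
import numpy as np

CHROMATIC_SHIFT_D = {"r": -0.3, "g": 0.0, "b": +0.7}  # diopters vs. green
PUPIL_MM = 4.0

def blur_sigma_px(defocus_d: float, px_per_mm: float = 10.0) -> float:
    """Blur-circle size grows with |defocus| (diopters) and pupil size."""
    return abs(defocus_d) * PUPIL_MM * 0.5 * px_per_mm / 10.0  # toy scaling

def gaussian_blur_1d(row: np.ndarray, sigma: float) -> np.ndarray:
    """Blur one image row with a normalized Gaussian kernel."""
    if sigma < 1e-6:
        return row.copy()
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    k /= k.sum()
    return np.convolve(row, k, mode="same")

def retinal_image(rgb_rows: dict, scene_d: float, focus_d: float) -> dict:
    """Per-channel defocus: scene distance minus focus, plus chromatic shift."""
    out = {}
    for ch, row in rgb_rows.items():
        defocus = (scene_d - focus_d) + CHROMATIC_SHIFT_D[ch]
        out[ch] = gaussian_blur_1d(row, blur_sigma_px(defocus))
    return out
```

The display-side step, solving the inverse problem for the image that reproduces this target on the retina, could then be posed as a per-channel least-squares deconvolution; that step is omitted here.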

Augmented Reality at Vorwerk

October 15, 2019 | Articles, Blog | 1 Comment


My name is Julia. I have been with Vorwerk for three years and have worked on our robot vacuum for 1.5 years. My name is Guido. I have been with Vorwerk for many years and work in product management. I am the man for all product questions in the big, wide, German Kobold world. The benefit of Augmented Reality, I would say, is above all that you can communicate abstract and technical topics in a way that reaches the viewer on a different level. Augmented Reality is nothing more than an enriched view of reality. This means you have a technical tool, and by adding computer-generated images, animations or videos, you have the opportunity to layer in more information. It’s simply more fun to look at information this way than to read a datasheet. Particularly with new products, we prepare a great deal of product knowledge. However, this usually comes in printed form, paired with a thick instruction manual. We have often noticed that perhaps not every page is read in detail. How do you explain a really great product that has really great features? So it was obvious to work with Augmented Reality. And since the VR300 is explicitly positioned as a “connected” device, the main features live in the app, and explaining that to the customer is quite abstract. There’s definitely a focus on the sensors. The robot is incredibly intelligent. It constantly reacts to the situation it is in at the moment. Accordingly, we have installed four sensors, which we can now show with Augmented Reality. That’s one thing, and the other nice part is that you can see how the air stream enters the robot dirty and is then blown out wonderfully clean again. The AR app for the VR300 is intended for the sales force as demonstration material, as well as for our stores, so that you can get a closer look at the product. At all levels of Vorwerk it has now become clear that Augmented Reality is extremely well suited to us, because we also have increasingly complex products. I very much hope that we will continue to develop this topic further. Of course there are a lot of ideas about what else we could do, exciting things we might develop for our customers, and at some point perhaps even a product in this direction. But we’re not far enough along to talk about that yet.

IE2: Innovative Technologies with Tom Earp

October 14, 2019 | Articles, Blog | No Comments


On this episode of Inside Engineering we talk with Tom Earp about the cool things he gets to do in his role as Innovative Technologies Manager and how he stays on top of what’s next in the industry. Inside Engineering: untold stories and fascinating people from the world of civil engineering. This is Episode 2. Recorded in September 2019. Innovative Technologies with Tom Earp. Inside Engineering is brought to you by RK&K. Learn more at rkk.com. Welcome back to another episode of Inside Engineering. We’re here this week with a good friend of mine, the Innovative Technologies Manager at RK&K, Tom Earp. Tom, welcome to the podcast; we’re excited to have you here today. Thanks, Tim. I’m excited to be here. OK. So, Innovative Technologies Manager — that sounds like a cool title. Walk us through what that means, because there are a lot of things that sound like they could fit under that umbrella. What does it mean to be the Innovative Technologies Manager? It means I get to play with all the fun toys. See, I knew it was a good title! No, actually, it is a lot of fun. So that’s the first part. Part of what I do involves working with the GIS side of things, geographic information systems, for those who don’t know. I also get to play around with drones; again, that’s part of the fun toys; and augmented reality and virtual reality. So, getting to explore those technologies. But really the biggest part of what I’m doing now is making sure that, firm-wide, we are looking ahead at what’s next. That’s a big part of what we do; we have to constantly be moving forward. You mentioned a few different technologies there in the description of what you’re doing. Can you break down some of the technologies that you’ve had a chance to work with over the past few years, and maybe even some things that, looking forward, might be under evaluation?
Sure, so we’ll start with GIS; that’s kind of what I started doing. GIS has been around for a long, long time, since the mid-1960s actually, but in the last five years or so the move to the cloud — moving GIS into the cloud — has been a huge game changer. We here at RK&K moved to the cloud about four or five years ago, and that’s really helped our field staff get their work done in the field using iPads; that cloud connection, real-time GIS, has been a really big technological innovation. So that’s one, and it’s something that’s constantly evolving. More recently, drones. That’s a hot topic. We’ve been working with drones for about two and a half years. We started off looking at some different uses, how we might use them in marketing, and then they’ve slowly become more and more adopted in other areas. So we’re using them for construction inspection management, documenting sites, but we’re also using them to provide some survey-like information: point clouds, developing surface models. That’s really starting to take off. We’re looking to do some bridge inspection work, structure inspections, things like that. It’s obviously a big safety issue when we’re talking about getting people up on rigging and on snoopers. So being able to fly a drone to do that will, I think, be more and more common as we go into the future. Something else we’ve looked at: augmented reality and virtual reality. Again, this is something we’re just starting to explore. We’ve actually taken this technology out to some trade shows, where we’ve been able to demonstrate to clients and people at these shows how we’re using it right now and what the possibilities are.
So it’s really exciting to look at how those technologies might really help our clients, because I think our clients don’t really know yet how they want to use these technologies. Those are things we’re working on right now. Down the road we’re looking at AI and machine learning. That’s another hot area. Everyone’s talking about AI and how machines are going to take over. Skynet. Yes. So we’re looking at how we could possibly use these technologies here. There’s definitely an application of this technology in computer vision, looking at images and classifying what the images are. So there’s a lot of opportunity there as well. That’s some cool stuff. Let’s walk through a couple of those things, because maybe we can talk a little more in depth about some of the specific applications. So you talked about drones. You’ve inspired me to get my drone pilot license recently, so that’s exciting, but one of the opportunities that we had recently — and you and I have been out on a bunch of different flights together — was to test out, you mentioned inspections, some drone technology that would help increase the safety and efficiency of bridge inspections. Can you tell us a little bit about that opportunity and some of the challenges those inspectors face compared to what the drone brings to the table? Sure. So obviously getting people up on a bridge that may be 150 feet off the ground. Which this one was. It was. There’s a safety issue right there. Here at RK&K, we’ve had people in danger during inspections. So part of it is a safety thing. The other is that this bridge inspection took them over a week to do. So they had to have Maintenance of Traffic out for a whole week. Closing a lane down. Closing a lane down on a bridge.
And I think if we were able to use the drone to do some initial inspections, or an initial conditions fly-through, and then evaluate where we need to actually get our hands on the bridge, that might help eliminate some of that MOT that we needed for a week. On very busy roads that could be a huge time saver, money saver. And an increase in safety, for sure. Absolutely. Any time there’s traffic management happening, there’s an increased risk to the workers and the travelers moving down the roadway. Yes, that was a really cool opportunity. We had some folks on a snooper on the bridge. For anyone who doesn’t know, a snooper is a truck with a big boom arm that booms a bucket under the bridge. So they’re on the bridge looking at that. Then we had some other guys suspended from a wire, sort of on a chair with a winch. They would literally just go up and down the piers and look at them in depth, and with the drone we were able to take a high-resolution camera, sit back a good distance from the pier, and capture imagery that can be used to make an evaluation. Yeah. The other thing is that, with the particular rigging for this pier we were looking at, the bridge inspector couldn’t cover the full width of the pier. They couldn’t get out to measure a defect that was toward the edge. They noted it, but they weren’t actually able to measure it. So we were able to take photos of the entire pier structure, stitch them together, and measure the actual size of the defect. And we were also able to do it much, much faster. Yeah, so time savings is another great win, and using drones could help these inspections so we’re not spending as much time in the rigging on the bridge. Right. There are also some cool technologies that you can attach to drones, in terms of different kinds of cameras and stuff.
You want to talk about some of the different uses for those? Sure. For this particular bridge inspection we were just talking about, we were testing out a zoom camera. This camera had, I want to say, 30x optical zoom and 6x digital. In our testing we were able to see objects over a mile and a half away and tell what kind of cars they were, things like that. So really high-end zooming. When we were on the bridge, we were able to see individual bolts way up; again, we were 150 feet below the bridge. So just using that zoom capability allows us to be farther away from the structure and still get really detailed, high-resolution images. That was part of our testing. Obviously we have regular cameras that we can use, and we did some of that while we were out there; that’s just the regular camera that’s on most consumer drones. We’d still like to do some more testing on other sensors. There are thermal sensors we could look at. We have a drone right now, actually we were out flying it today, that has a multi-spectral sensor, so it’s doing infrared, so we can look at plant health and things like that. And on plant health in particular, we’ve had some work around relocating seagrass where I think we tried that technology. We looked at it, but this seagrass happened to be a little too far underwater to actually make it work in this case; that was the intention, though, and with the right depth this would be a good application. Potentially. Yeah, potentially. You also mentioned AR, augmented reality. There’s a lot of talk around AR and VR these days for a bunch of different uses. Can you give an example of what a use case might be for augmented reality inside of civil engineering? Sure. One great use is getting stakeholders involved in the process.
So you can take a design, and then we can get around a table, and you can have multiple devices looking at a model and interacting with it. An example we've done is bike lane alternative development: we were able to take a few different bike lane alternatives, put them into AR goggles, in this case the Microsoft HoloLens, place them on the table, and cycle through them and discuss them in front of potential stakeholders. So that's one great use case. There are lots of others we're also exploring. Some of the trade shows we've been going to recently have been water and wastewater trade shows, so, pump stations. We're able to show a design of a pump station. We can scale the model so you can see all the details inside, and then again place it on a table to discuss it. We've also shot video where we placed it on a site at full size, so you could potentially go out to the site where that pump station would be, actually visualize it, and walk around or through the site. Absolutely. All in augmented reality. It's pretty cool. Yeah. So again, a client could see what something might look like in the field before it was constructed. That's really neat. OK, so we're doing all this stuff, and there are really cool things that you get to do, but at the end of the day it comes down to, 'how does this help our clients?' So how do the things you do help our clients, or help our people, who then in turn help our clients? Sure. Can you talk about that? Let's talk internally first. One of my roles is to connect people that have good ideas on things we might be able to do, technology we should be using, with other people in the firm that have similar goals or are already working on them. That way we're not working in silos. So connecting the right people is a huge part of what I do.
Facilitating that kind of startup of an idea, and that's already happened since I started this role. A great example is connecting some of our CM folks in Florida, who had a great idea for managing photos on a project, with work that's going on here in Baltimore. We've been able to connect them and get a solution that works for everyone. So that's one way, internally. But that also benefits our clients, right? Because we're more efficient in managing our workflows and managing photos, we're able to perform better on a project. So that internal win is a win for our clients. Absolutely. Yeah. I mean, media asset management is increasingly a significant challenge: where do you store this big media? How do you store it? How can you find it easily, so that you can spend more time helping the client and less time looking for a picture or video of something? That is a really good example of connecting different ideas together. So you're helping the clients, OK, but then you have to know whether what you're doing is a success or not, so that you know whether to continue doing it, and I think that's always a challenge no matter what field you're in. How do you measure the success of what you're doing? Yeah, that's kind of hard, because how do you measure efficiency? Sometimes you can measure it and sometimes you can't. In some cases we're able to do more work in less time. If we go back to the iPads, using them in the field to collect data, right? Instead of writing all your information on a form and then coming back to the office and digitizing it in some way, you're collecting it in that format right away. So obviously hours spent, that's an efficiency gain. That's one way we can measure what we're doing. But how do we measure connecting people internally that need to know about what we're working on?
So that’s a lot more challenging and I think you know we can we can measure that by just looking at the success of the project. So you know were we more successful in delivering this particular project? So that’s one way. You know if we’re talking about drones, it could be you know, again a time saving that’s what were looking at, but safety. How do we measure safety? You know is it is a fewer accidents? Is it less time spent you know in a dangerous situation? It’s a challenge sometimes a measure exactly what we’re doing, but I think a lot of you know a lot of our technology advances that we’re looking at are gonna be efficiency, efficiency savings, time savings, letting our staff work on the things that they’re really good at instead of wasting time on things that they don’t need to be doing. Efficiency just spreads across, across everything. I mean we’re always, always looking for ways to be more efficient with something and do it better the next time and constantly learning, that sort of continuous innovation model I think is really important and it applies in your field a lot. Tom, sort of big picture question here, what’s something that you are curious about right now? There’s lots of things but I think one of the the big things I’m curious about is AI and machine learning. I think this is… this is coming. We all have to get used to it and some of us are already dealing with this now even though we may not know it. So one example is you know like if you get if you get targeted by an ad if you’re on Facebook or something like that. I’m not, but there’s algorithms in the background that’s looking at your history, what you’re looking at, and it’s presenting you with things. I think applying this to engineering and what we do here is super interesting. So you know figuring out how we can use this technology to to improve kind of what we’re doing as a firm to improve, again our efficiency and just working smarter. I’m really interested in that. 
What facet of that technology are you most looking forward to? Are there any big, broad brush stroke examples that you could give? One thing that I've been interested in is computer vision: being able, either in augmented reality or using drones, to look at something and have the AI figure out what you're looking at, classify it in some way, and give you a result. So maybe it's change detection, or you use the drone and you're flying the bridge and it can detect defects automatically. That type of instant feedback is really interesting, and I think it's something we're going to be doing more of in the future. I know that in communications we use an AI service for our transcriptions, of this podcast in fact. When we're done with the episode we want a transcript of the entire thing, and so our first pass at that transcription is through an AI system. And it's good. We have to clean it up, obviously, but it saves so much time over doing it manually that it is a real efficiency gain. So I look forward to having more of those things come along, and I'm really glad we have someone specifically focused on keeping us on top of that. So, Tom, what else do you want to talk about? There are so many things that you do, I feel like we could sit here all day. What's something we haven't covered? We've covered drones. Again, that's a big one, and I think it's just going to increase. AR use is definitely going to increase. I don't know. Is there something else you're excited about? We talked about what you're curious about, but is there a particular part of your job that you get really jazzed for? All of it. All of it. I know you said don't say 'all of it' or 'everything.' No, you know, I love technology, so finding ways for our staff and people here to use technology better and more efficiently, to get their jobs done better, to remove roadblocks to getting their work done, is something I'm really interested in. So I like having those conversations with people. It doesn't always have to be about technology; they may not be thinking about a technology solution that I might be thinking about. I like hearing those things and then trying to figure out how we can work them into a technological solution. Right, and the thing that I think is cool about your job is that not every idea gets implemented, but the good thing is that there's a process for evaluating these ideas and seeing how we can apply them. And hey, maybe this thing over here didn't work out now, but maybe there's an application for it at some other point. So it's not just 'yeah, let's do that!' or 'yeah, let's try this new technology, that new technology!' There's a very careful, thoughtful approach to these things. It is, and it's not just me sitting here by myself in a room. In a vacuum. Typing away or thinking away all day. It's getting the people to the table that have a stake in what we're doing. I have lots of examples that you've been involved in, like the asset management things we're looking at, digital asset management. Really it's bringing those people together to look at something, figure out how it works for us, and then move forward with it. So it's a lot of fun. I'm looking forward to doing more of it. You have a cool job. I do have a cool job.
All right so we’ve arrived at the point now where Tom gets to give us his pick of the week. This is where Tom recommends something to us that he thinks we are going to enjoy. And I don’t know what it… Actually I do know what this one is he showed me what this one is early. So I’m actually excited about this. I think this is cool. Tom take it away. Pick of the week. So one of the things that I really like to do is is read. I read a lot of leadership books and just just other maybe like self-improvement type books. But a lot of them are focused on leadership. And so one of the ones I’ve read recently it’s been out for a few years. The author is Jocko Willink. He was a Navy SEAL and his his book is called Extreme Ownership. So you know it’s a it’s a really great leadership book. He takes his lessons learned leading SEAL teams and applies that, well he tell stories, and then he applies that to business and how we might apply those kind of in a business setting because you know he’s talking about being in Iraq and you know real life dangerous situations but you can apply those in the real world. And I’ve actually had to do it here at RK&K, you know using those leadership principles, to kind of take ownership of something kind of leading my team, so again it’s a it’s a great great book and I recommend it to everyone. Extreme Ownership? Extreme Ownership. Alright, we’ll put the info to that in the show notes. Thanks Tom that’s good. Books are always a good thing. Well thank you Tom for coming into the studio where we appreciate you being here and sharing your thoughts. I think we’ll probably have to have you back some other time to talk about the next cool thing that you’re working on because it’s always something. It’s always something. It’s always something cool. All right well thank you all for joining us for this episode of Inside Engineering. We’ll see you next week. Hey everyone thanks for watching this episode; I hope you enjoyed it. 
Inside Engineering comes out every Tuesday, and we're available, hopefully, on your favorite podcasting platform; we're trying to be in as many places as possible. So please take a minute to rate and review the show. You can also stream on demand at our website at rkk.com/podcast, where we also have a real short survey asking for feedback on the show, because we want to make it as good and as valuable to you as we can. So thanks for watching. We'll see you next time.

Shifu Orboot Augmented Reality Educational Toy Globe Review | jacksonandrowen

Hey, you guys want to ride your scooter and bike down the cul-de-sac? Don't go too fast, 'cause I can't keep up. You know that, right? You're not going to wait for Rowen? Too fast. Did I make a mark? You did! Stay right there, wait for Rowen. Here she comes. Umm... no, I don't see a mark. Dang! I hit the brake hard. Hey! Hey! Hey guys! Come up here, Rowen. So, while we were out on a scooter and bike ride, guess what? What? The mailman pulled in and delivered something to the front door.

Make an Augmented Reality (AR) 360 portal in Snapchat FREE | No Coding Required!

Today, I am going to teach you to create this. [Music] The cool part is, this entire Augmented Reality experience is created with Snapchat Lens Studio, free, without writing a single line of code or touching Unity or ARKit. So anyone can do it, as long as you know how to capture 360 photos with one of these cameras. This is a really cool party trick, as well as an engaging way to showcase your 360 photography to your friends or strangers. I will also show you how to put this on a
business card. So when people scan your business card with
their Snapchat, the full AR experience will show on their phone without any third-party
app download. Trust me, if you do that, your business card is the one they will keep at a busy conference. I generate so many new leads because of this trick. So, if you want to leverage AR without writing any code, let's get started! [Music] Hey, what's up everybody, it's your boy Hugh
here from CreatorUp, the #1 YouTube channel dedicated to 360 virtual reality, and now augmented reality, for everyone. Before I start the tutorial, I want to credit the team from pixelcase, who first shared how to do this, and also Snapchat, who provided the portal template right here on their site. Step 1, prepare your 360 photos. I assume you already know how to capture, stitch, and create beautiful 360 photos if you follow this YouTube channel. If this is your first time here, start with
this Photoshop tutorial on 360 photo editing with the Insta360 ONE X, or this tutorial on capturing DNG8 HDR 360 photos with the Kandao Qoocam, which is still one of the best consumer 360 cameras for 360 photos. Go ahead and open Photoshop. Import one of your finished 360 photos. I suggest an HDR 360 photo captured with a DSLR or another high-end camera, as we need to compress this a lot for Snapchat. Here I will import my Malibu sunset HDR 360 photo, captured with a drone and a 360 camera. I want people to feel like they are flying when they walk into my portal, because 90% of the time when I show this to people, they look down immediately when they walk into the portal. A drone shot with interesting ground usually
is the best for this. If you don't have a drone, I suggest pointing the 360 camera out of a window in a tall building and capturing a high-altitude 360 photo. Then, duplicate the background layer and delete the original background layer, as we don't need it anymore. Go to Edit, Transform, Flip Horizontal to flip the image, as we are going to mirror the image on Snapchat later. Then go to Image, Image Size, and change the image size to 2048 by 1024; Snapchat has a size limit for 360 photos. Then go to File, Export, Export As, and save as a JPG; around 80% quality will be good. The goal here is to keep the file size as small as possible, so Snapchat will approve your creation. Step 2, download Snapchat Lens Studio and
the portal template. So go ahead and follow the link in this video's description down below to download Snapchat Lens Studio and my portal template. Go ahead and open Lens Studio and import my Snapchat template. The template I provided here is very bare minimum. If you want to create fancy 3D effects and motion-triggered animation, I highly recommend reading the official documentation from Snapchat, so you can create a way better-looking portal than the one I created. If you do, send me the link in the comments below and let me download your creation. After you open up my portal template, you
see everything has already been created for you. You can click the hand icon to move around the scene. If you use the scroll wheel on your mouse, you can zoom in and out to see what is really going on in this design. Open Texture, go ahead, and replace the 360 photo. Click the "change texture" button here and pick a new 360 photo. I chose my Shanghai city night 360 photo instead. This photo, by the way, is one of my most popular 360 photos on Facebook, with a reach of 53 thousand people and 220 likes. Follow me on Facebook, if you have not already, for 360 inspiration like this photo. Now, go ahead and also replace your logo here. If you don't want to bother, feel free to leave my logo to credit me, or simply hide it by unchecking the guided arrow module. The last thing you need to do is to pick
the right music for your photo. Remember, in the demo, after you go into the portal, a sound effect starts to play to make you feel like you are there. Click WorldObjectController right here. Then double-click the audio track name to open a new window. Hit Add New to import your music. Choose your new music and hit Add to close the window. One thing you need to pay attention to is your lens file size. If it is more than 4 MB, Snapchat will not allow you to publish your lens. So make sure the audio file you import is as small as possible. The file I imported here is a WAV file, which is total overkill. So make sure to convert it to an MP3 before you import it. Now that we have fixed the file size problem, go ahead
and pair your phone to Lens Studio for local testing. Hit right here and generate a Snapcode. Open Snapchat on your phone and point the camera at the code. Tap and hold your screen to pair your phone to Lens Studio. After your phone is connected, you can hit Push Lens To Device to test your new build. As you see here, you can pinch and zoom the portal anywhere you want in the real world. [Music] Before you publish your lens, make sure to go over Project Info here and add a lens preview, like so. This one looks good to me. Also give the lens a name and change to your own lens icon. Hit Apply, and we are ready to publish your Snapchat AR lens. Hit Publish Lens right here. Give some tags for discovery and add a lens preview video if you have one. Then hit the big blue Submit button, and you are done! It usually takes Snapchat 5 to 10 hours to approve your lens, depending on your file size. As you can see, my lens is already published and
you can scan it and load it up onto your phone. Step 3: print AR onto your business card. To make it user-friendly, we are going to build a QR code and print it onto your business card, like so. I like to add a splash page to show all my social media, especially my LinkedIn, on my custom-made profile, as you see right here. After the fun AR experience, I still want people to connect with me on LinkedIn so we can do business later. I used a QR code generator here. It allows me to add all my social media and a nice loading animation with my face on it inside Snapchat. It is all about user experience and presentation; the little things count. But you can use any other free QR code solution. It will work just as well. Thank you for watching this quick and fun
tutorial. I hope you enjoyed it. Give me a like and share this video with your friends. Anyone can do this, so share it with all your friends and family and provide them with your 360 photos for integration. I am sure you have a lot of 360 photos by now. As you may have noticed, there are limitations: your 360 photo has to be under 2K, and you cannot use 360 video. And if you are in China, there is no Snapchat. So in the next AR tutorial, I will teach you to create the same portal effect with Unity 3D and ARKit, so you can showcase your 360 film in a fun and engaging way, and potentially make money for your clients. So don't forget to subscribe, and comment below with what AR application you want to learn next. And send me your Snapchat creation, or a reaction video of when your friends see this. I will see you next time. 360 creators, level up on CreatorUp.
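For readers who batch-process many 360 photos, the Step 1 Photoshop workflow in this tutorial (flip horizontally, resize to 2048 by 1024, export as JPG at around 80% quality) can also be scripted. Here is a minimal sketch using Python's Pillow library; the function name is my own invention, and this is an illustration rather than part of the original tutorial:

```python
# Sketch: automate the Step 1 photo prep (flip, resize, compress) with Pillow.
from io import BytesIO

from PIL import Image

SNAP_MAX_W, SNAP_MAX_H = 2048, 1024  # Snapchat's 360-photo size limit

def prepare_360_photo(img: Image.Image, quality: int = 80) -> bytes:
    """Mirror, resize, and JPEG-compress a 360 photo for Lens Studio."""
    # Flip horizontally, matching the Edit > Transform > Flip Horizontal step,
    # because the image gets mirrored again inside the Snapchat portal.
    flipped = img.transpose(Image.FLIP_LEFT_RIGHT)
    # Resize down to Snapchat's 2048 x 1024 limit.
    resized = flipped.resize((SNAP_MAX_W, SNAP_MAX_H))
    # Export as JPEG at ~80% quality to keep the file small.
    buf = BytesIO()
    resized.convert("RGB").save(buf, format="JPEG", quality=quality)
    return buf.getvalue()
```

Lowering the `quality` argument shrinks the file further if Snapchat still rejects the upload for size.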

AR Camera GYRO: Augmented Reality for Unity

AR Camera GYRO: Augmented Reality for Unity. Demo scene with Unity assets:
AR Camera GYRO & AR Shadow. Demo of gyroscope behavior. AR Camera GYRO uses the camera and gyroscope on the player's mobile device to display 2D or 3D objects as though they were in the real world. Note: this cross-platform mobile camera implements markerless augmented reality (AR) by using the gyroscope, like Pokemon GO. Get AR Camera GYRO:
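To illustrate the gyroscope idea behind assets like this: each gyroscope sample is an angular velocity that can be converted into a small rotation and accumulated into the virtual camera's orientation, so on-screen objects appear fixed in the real world. The asset itself is a Unity (C#) package; this Python sketch of the underlying quaternion math is an illustration only, not its actual code:

```python
# Sketch: accumulating gyroscope samples into a camera orientation quaternion,
# the core of markerless, gyro-driven AR.
import math

def q_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_from_gyro(wx, wy, wz, dt):
    """Small rotation from one gyro sample (rad/s) integrated over dt seconds."""
    mag = math.sqrt(wx*wx + wy*wy + wz*wz)
    if mag < 1e-12:
        return (1.0, 0.0, 0.0, 0.0)  # no rotation: identity quaternion
    half = mag * dt / 2.0
    s = math.sin(half) / mag
    return (math.cos(half), wx*s, wy*s, wz*s)

def rotate(q, v):
    """Rotate vector v by quaternion q (computes q * v * q^-1)."""
    w, x, y, z = q_mul(q_mul(q, (0.0, *v)), (q[0], -q[1], -q[2], -q[3]))
    return (x, y, z)

# Integrate 1 second of gyro samples: 90 deg/s about the y (up) axis.
camera = (1.0, 0.0, 0.0, 0.0)  # identity orientation
for _ in range(100):
    camera = q_mul(camera, q_from_gyro(0.0, math.pi / 2.0, 0.0, 0.01))

# The camera's forward axis (0, 0, 1) should now point along +x.
forward = rotate(camera, (0.0, 0.0, 1.0))
```

Rendering virtual objects through a camera oriented this way, over the live phone-camera feed, is what makes them appear anchored in the room as the player turns.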