Engineering Proposal

Jonathan Anwar

Professor Otte

Writing for Engineers

December 4, 2020

Engineering Proposal

Purpose:

The purpose of this proposal is to obtain funding and approval to continue research and begin design of a LiDAR-equipped device that detects obstacles ahead of the user and can serve as a navigation aid for people who are visually impaired.

 

Summary:

We live in a time of daily technological advancement, yet people who are visually impaired have been using essentially the same guide canes since the 1930s. I would like to develop an app that utilizes the LiDAR sensors on the iPhone 12 Pro, the iPad Pro, and select Android devices. A LiDAR sensor has its own illumination source: it emits a series of light pulses and measures the time it takes for those pulses to be reflected back. This lets the device determine the distance of objects in front of it based on how long the light takes to return. The app would warn the user if there is an obstacle in the way or a sudden change in the terrain ahead.

I propose developing an app that can sense:

  • Walls.
  • Vehicles.
  • People.
  • Different terrain.
  • Bumps in the ground that could cause a person to trip and fall.
  • Curbs.

This project would take approximately 26–50 weeks to complete and cost approximately $81,922–$152,122 (not including continuous development and updates).

 

Introduction:

As of 2016, 2.13% of New York residents were visually impaired. That percentage may not seem large, but it equals roughly 418,500 people in New York. We often take our eyesight for granted, yet living with a visual impairment is extremely hard. Attempting everyday tasks without the ability to see would seem impossible to most, but it is the reality for many New Yorkers. In this proposal, I would like to:

  • Discuss how Light Detection and Ranging (LiDAR) sensors work.
  • Discuss how using high definition cameras and LiDAR sensors can function together.
  • Discuss how new devices, such as the iPhone 12 Pro and the iPad Pro, now include LiDAR sensors.
  • Show how the sensors on these devices can be used to detect obstacles in the user's path and warn the user in time to avoid them.

It is striking that, despite constant technological breakthroughs, so few new devices or machines have been built to help the visually impaired. The visually impaired seem to have been neglected when it comes to technology that could benefit them. Guide canes (the white canes used by the blind) came to the United States in the 1930s and are still in use today, largely unchanged, even though nearly everything else in use in the 1930s has since been improved upon.

 

Proposal:

In today's world, small devices such as cell phones and tablets have become remarkably powerful; they are no longer used just for calls and texts, but for many day-to-day tasks. Millions of people rely on their phones or tablets daily to complete tasks both simple and complex. I would like to use some of this technology to help those who are visually impaired.

Specifically, I would like to focus on the iPhone 12 Pro and the iPad Pro, which now come equipped with a LiDAR (Light Detection and Ranging) sensor. These sensors were originally included to measure distances, for example for the “Measure” feature on those devices.

The iPhone 12 Pro comes equipped with three 12-megapixel cameras and a LiDAR sensor (as seen in Diagram 1). The LiDAR sensor in the iPhone 12 Pro and the iPad Pro has a range of up to 5 meters, which is more than enough for the use proposed here. The device sends out a series of light pulses as a spray of infrared dots and measures each one with the sensor. This lets the device build an image made up of points that record their distance from the device (as seen in Diagram 2).
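
As a rough illustration of how an app might read this depth image in software, the Swift sketch below uses ARKit's scene-depth API, which exposes the LiDAR depth map on supported devices, and samples the distance straight ahead of the camera. The DepthReader class and the choice to sample only the center pixel are assumptions made for illustration, not part of a finished design.

    import ARKit

    // Minimal sketch: read the LiDAR depth map through ARKit and report the
    // distance straight ahead. DepthReader and the center-pixel sampling are
    // illustrative choices only.
    class DepthReader: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            let config = ARWorldTrackingConfiguration()
            // Scene depth is only available on LiDAR-equipped devices.
            if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
                config.frameSemantics.insert(.sceneDepth)
            }
            session.delegate = self
            session.run(config)
        }

        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            guard let depthMap = frame.sceneDepth?.depthMap else { return }
            // The depth map is a pixel buffer of 32-bit float distances in meters.
            CVPixelBufferLockBaseAddress(depthMap, .readOnly)
            defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }
            guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }
            let width = CVPixelBufferGetWidth(depthMap)
            let height = CVPixelBufferGetHeight(depthMap)
            let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
            // Sample the depth value at the center of the frame.
            let row = base.advanced(by: (height / 2) * rowBytes)
            let centerDepth = row.assumingMemoryBound(to: Float32.self)[width / 2]
            print("Distance straight ahead: \(centerDepth) meters")
        }
    }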

The LiDAR sensor has its own illumination source: it emits its own light pulses and measures the time it takes for those pulses to be reflected back. This allows the phone to determine the distance of objects in front of the device based on how long the light took to bounce back (as seen in Diagram 3).
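
To make the time-of-flight idea concrete, the short sketch below shows the underlying arithmetic: the distance is the speed of light multiplied by the measured round-trip time, divided by two because the pulse travels out and back. The numbers are only examples.

    // distance = (speed of light x round-trip time) / 2
    let speedOfLight = 299_792_458.0  // meters per second

    func distance(forRoundTripTime seconds: Double) -> Double {
        return speedOfLight * seconds / 2.0  // halved: the pulse travels out and back
    }

    // A pulse that returns after about 13.3 nanoseconds corresponds to roughly 2 meters.
    print(distance(forRoundTripTime: 13.3e-9))  // ~1.99 meters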

I would like to design an app for the iPhone 12 Pro and the iPad Pro on the App Store, and for LiDAR-equipped Android devices on the Google Play Store. The app will detect walls, cars, people, and other obstacles in the direction the sensor is pointing. Once the sensor detects an obstacle, the device will vibrate, with the intensity increasing as the obstacle gets closer. Since the sensor's range is up to 5 meters, the device will produce a very slight vibration when an obstacle is 4–5 meters away and a slightly stronger one at 3–4 meters, then step up the intensity again at 2.5–3 meters and at 2–2.5 meters. Closer than 2 meters, the vibration will intensify every 0.25 meters until the final meter, and then every 0.1 meters from 0–1 meter.
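
A minimal sketch of this tiered logic is shown below. The distance boundaries follow the ranges described above; the function name and the numeric tier values are placeholders, not final design choices.

    // Map a measured obstacle distance (in meters) to a vibration tier, where a
    // higher tier means a stronger vibration. The boundaries follow the proposal;
    // the tier numbers themselves are placeholders.
    func vibrationTier(forObstacleAt meters: Double) -> Int {
        switch meters {
        case 4..<5:   return 1                              // very slight
        case 3..<4:   return 2                              // slightly stronger
        case 2.5..<3: return 3
        case 2..<2.5: return 4
        case 1..<2:   return 5 + Int((2 - meters) / 0.25)   // a new step every 0.25 m
        case 0..<1:   return 9 + Int((1 - meters) / 0.1)    // a new step every 0.1 m
        default:      return 0                              // out of range: no vibration
        }
    }

    print(vibrationTier(forObstacleAt: 2.2))  // 4
    print(vibrationTier(forObstacleAt: 1.6))  // 6 (closer, so a stronger tier)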

The main issue I saw with this idea is that a guide cane lets the user feel changes in the terrain, for example bumps, ditches, and potholes. This matters especially in New York, where the sidewalks and streets are riddled with such hazards. To overcome this, the app would also make the device vibrate for changes in terrain, such as a bump in the street or a curb, with the vibration again intensifying as the terrain change gets closer.

Adding terrain-change alerts raises another problem: how will the user tell whether a vibration is caused by a change in terrain or by an obstacle ahead? To address this, the app will let the user customize separate vibration patterns for obstacles and for changes in the ground ahead. This is similar to how different apps use different vibration patterns for their notifications; most people can tell what kind of notification they received while the phone is still in their pocket.
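
The sketch below shows one way such user-customizable patterns could be represented in the app. The AlertKind and HapticCue types and the default pulse values are hypothetical, chosen only to show that an obstacle alert and a terrain-change alert can be made to feel clearly different.

    // Two distinguishable cue types so the user can tell, by feel alone, whether
    // the alert is for an obstacle or for a change in terrain. All names and
    // values here are illustrative placeholders.
    enum AlertKind {
        case obstacle
        case terrainChange
    }

    struct HapticCue {
        var pulseCount: Int        // number of vibration pulses in the cue
        var pulseDuration: Double  // length of each pulse, in seconds
    }

    // Defaults the user could override from a settings screen in the app.
    var patterns: [AlertKind: HapticCue] = [
        .obstacle:      HapticCue(pulseCount: 1, pulseDuration: 0.30),  // one long buzz
        .terrainChange: HapticCue(pulseCount: 3, pulseDuration: 0.08)   // three short taps
    ]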

Diagram 1: The rear camera array of the iPhone 12 Pro, showing its three cameras and the LiDAR sensor.

Diagram 2: A LiDAR point image, in which each point records its distance from the device.

Diagram 3: How the LiDAR sensor measures distance from the round-trip time of its own light pulses.

 

Timing:

On average, it takes three to nine months to create and develop an app. Because this app's functionality is relatively straightforward and its goal is already defined, it should take about six to nine months to develop and test. Testing will likely take much longer than development itself, because the app performs a safety-critical function and cannot have bugs, glitches, or other problems that could lead to someone getting seriously injured.

 

Process                               Length of Time
1. Designing the app                  6–12 weeks
2. Developing the app                 6–12 weeks
3. Deploying to the app stores        2 weeks
4. Testing the app                    12–24 weeks
5. Updating and improving the app     Ongoing

 

Total for steps 1 through 4: approximately 26–50 weeks.

 

Budgeting:

 

Part                                     Price ($)
iPhone 12 Pro                            999 (MSRP)
iPad Pro                                 799–999 (MSRP)
Apple App Store developer account        99 per year
Google Play Store registration           25
Writing code and developing the app      80,000–150,000
Testing                                  To be determined
Continuous development                   To be determined

 

Total: approximately $81,922–$152,122.

Plus $99 per year for the Apple App Store developer account, as well as ongoing app development costs.

 

Citations

Apple, apple.com/.

Brookes, Tim. “What Is LiDAR, and How Will It Work on the iPhone?” How-To Geek, 2 Nov. 2020, www.howtogeek.com/695823/what-is-lidar-and-how-will-it-work-on-the-iphone/.

“How Long Does It Take To Build An App?” 3 Sided Cube, 22 May 2019, 3sidedcube.com/how-long-does-it-take-to-build-an-app/.

“How Much Does It Cost to Develop and Build an App.” UTILITY, utilitynyc.com/blog/app-development-cost.

“How Much Does It Cost to Publish an App on the App Store.” Appy Pie, www.appypie.com/faqs/how-much-does-it-cost-to-publish-an-app-on-the-app-store.

“Lidar.” Wikipedia, Wikimedia Foundation, 29 Nov. 2020, en.wikipedia.org/wiki/Lidar#Sensor.

Stein, Scott. “Lidar on the iPhone 12 Pro: What It Can Do Now, and Why It Matters for the Future.” CNET, 21 Nov. 2020, www.cnet.com/how-to/lidar-explainer-apple-iphone-12-pro-and-pro-max-what-it-can-do-now-why-it-matters/.

Stein, Scott. “Trying Some of the LiDAR-Enabled AR Apps I Can Find for the 2020 iPad Pro, to Show Meshing. Here’s One Called Primer, an Early Build to Test Wallpaper on Walls.” Twitter, 14 Apr. 2020, twitter.com/jetscott/status/1250127483376148484.

 
