
3 Expectations for WWDC 2016 [Updated]

Hey Tim,
We’ve narrowed our list of WWDC expectations down to three. Just three little ones.

1. Peer-to-peer payments

This is a no-brainer, and how it’s taken this long baffles us.

We know it’s not easy. Nothing that’s built for a billion devices is. And we realize Apple still hasn’t fully grasped services yet. And we know that holding customer money for long periods is going to hit some serious regulations.

But you’ve had accounts for years. What was needed, functionally, was a way for customers to store money in an account. Once it’s stored, your customers can decide whether to put the money back onto their credit cards (for some sort of processing fee, if necessary), move it around to other customers, or spend it on Apple products. We don’t need it linked to a checking account. Yeah, that would be nice, but it’s not necessary.

And what a great way for parents to give their kids an allowance. Or friends to pay off gambling debts. And once the money’s sitting there, it’ll be much easier to spend on Apple stuff.
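The stored-value account described above can be sketched in a few lines. This is purely our illustration of the idea, not any real Apple payments API; the `Ledger` class, account names, and the 2.9% fee are all invented for the example.

```python
# A minimal sketch of the peer-to-peer flow described above.
# The Ledger class, account IDs, and fee rate are all hypothetical.

class Ledger:
    CARD_WITHDRAWAL_FEE = 0.029  # invented processing fee (2.9%)

    def __init__(self):
        self.balances = {}  # account id -> cents held in the account

    def deposit(self, account, cents):
        """Load money from a card into the stored-value account."""
        self.balances[account] = self.balances.get(account, 0) + cents

    def transfer(self, sender, receiver, cents):
        """Move stored money between two customers -- no fee needed."""
        if self.balances.get(sender, 0) < cents:
            raise ValueError("insufficient funds")
        self.balances[sender] -= cents
        self.balances[receiver] = self.balances.get(receiver, 0) + cents

    def withdraw_to_card(self, account, cents):
        """Push money back onto a credit card, minus the processing fee."""
        if self.balances.get(account, 0) < cents:
            raise ValueError("insufficient funds")
        self.balances[account] -= cents
        return round(cents * (1 - self.CARD_WITHDRAWAL_FEE))  # cents back to card
```

The allowance case is then one `deposit` by the parent and one `transfer` to the kid; the fee only shows up when money leaves the system for a card.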

PayPal’s had this service for years. With passwords. It’s not rocket science.

So come on. What’s taken so long?

2. Street view maps

But not just street view.

You’ve got flyover data, satellite data and now street view data. Put them together already.

We expect your 3D rendering to seamlessly meld with the street view. From satellite down to the street and back up to flyover without a bump.

That will be cool.

And that’s not to mention a much-needed update to your business listings. Yours are poor. Why we can’t see most business names even at maximum zoom is beyond us. It’s not a space issue. And we know the businesses exist, because we can search for a specific business and it’ll pop up on the map.

All we can think is that it’s about Apple making money and not about Apple providing a service.

But that’s why you’re still lagging in services. Which is a separate topic. Sorry to digress.

3. Indoor mapping, a new HomeKit app, and a secure enclave in the AppleTV (or a new Echo-like device we’ll call AppleVoice) where our private data is stored

That’s not too much to ask.

Here’s what we, the simple folk that we are, see.

You’ve got three primary technologies to help with indoor iPhone location: iBeacons, Bluetooth and Wi-Fi. At least one of these must exist in an independent device in the home as a location reference. Throw in GPS data for added support. The more data, the merrier, in terms of accuracy.
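One simple way to read "the more data the merrier" is sensor fusion: each source (beacon, Wi-Fi, GPS) produces a position fix with some uncertainty, and you weight each fix by the inverse of its variance so tighter estimates count more. This is a textbook technique sketched by us, not Apple's actual pipeline, and the accuracy figures in the test are made up.

```python
# Fuse position fixes from several sources by inverse-variance weighting.
# A back-of-the-envelope sketch; real indoor positioning is far richer.

def fuse_fixes(fixes):
    """fixes: list of ((x, y), sigma) -- a position estimate in meters
    plus its standard deviation. Returns the weighted mean position."""
    wx = wy = wsum = 0.0
    for (x, y), sigma in fixes:
        w = 1.0 / (sigma * sigma)  # tighter estimates get more weight
        wx += w * x
        wy += w * y
        wsum += w
    return (wx / wsum, wy / wsum)
```

With one sloppy GPS fix and one tight beacon fix, the fused result lands close to the beacon's answer, which is exactly the "added support" role GPS would play indoors.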

Then you’ve got the camera, and soon, hopefully, new cameras with new capabilities. Already, you can derive some distance data from the focus. Any laser or dual-lens setup added to the camera will just make the information more accurate.

And then you’ve got the new HomeKit app, which is a central repository for all home-connected devices. It’s going to be cool. With centralized commands and a visual map, we’ll be able to both see and control everything in the home.

How the mapping works:

We put our phone in map mode. It turns on the camera and starts taking video. We walk around our home, apartment, office, or whatever, point our phone up and around, and let your software collect all sorts of mapping data. The more we move our phone around (not wave it), and the more different views we get of any specific piece of furniture, the more accurate the data. We can also use our voice to tag the video: ‘living room’ or ‘Teddy’s bedroom.’ Tie that to coordinate locations in the house and you’ve got a basic HomeKit map. And with your 3D rendering software, the mapping tool can identify lamps and thermostats and speakers for use by the HomeKit app.
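The walk-through above boils down to two streams of data: voice labels stamped with the phone's current coordinates, and fixture guesses from the rendering engine filed under the nearest labeled room. A toy sketch of that data structure, with every name and class invented by us:

```python
# A toy "basic HomeKit map": voice labels become room reference points,
# and fixtures guessed from the video attach to the nearest labeled room.
# Entirely hypothetical; not HomeKit's actual data model.

class HomeMap:
    def __init__(self):
        self.rooms = {}     # room name -> (x, y) reference coordinate
        self.fixtures = []  # (guessed kind, (x, y), assigned room name)

    def label_room(self, name, position):
        """Record a voice label heard during the walk-through."""
        self.rooms[name] = position

    def add_fixture(self, kind, position):
        """File a fixture guess under the nearest labeled room."""
        room = min(self.rooms,
                   key=lambda r: (self.rooms[r][0] - position[0]) ** 2
                               + (self.rooms[r][1] - position[1]) ** 2)
        self.fixtures.append((kind, position, room))
```

So a lamp spotted a meter from where we said ‘Teddy’s bedroom’ ends up listed as a bedroom fixture without us typing anything.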

So then the rendering engine runs against the video and a digital map is made. We plug this into the AppleTV secure enclave for our own personal use and start cleaning up.

For instance, if we happen to have a lamp shaped like a crane, it might not be easily identified as a lamp.

No worries.

We can now pull up the map on our nice wide-screen and clear up any potential mis-mapping that may have occurred. If we’re deaf, we can type, or if we’re not we can use Siri and our finger. We can touch the crane lamp and say, hey, that’s the corner lamp in Teddy’s bedroom and it’s using the Philips connected bulb.

Then, once we’ve cleared the map of all of the misidentified fixtures, we’ll be able to tell Siri, Hey, Siri, turn on the corner lamp in Teddy’s room. Or, Hey, Siri, turn up the heat in the living room, we’re having a tiki party. And HomeKit will know and Siri will order us a case of coconut rum.
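Behind a command like ‘turn on the corner lamp in Teddy’s room’ is a simple lookup: match the spoken fixture name and room against the cleaned-up map and hand back the connected device to switch. A toy version, with the fixture list, room names, and device ID all made up for the example:

```python
# Resolve a spoken (fixture, room) pair to a connected device.
# A toy lookup over the cleaned-up map; not how Siri/HomeKit work.

def resolve(fixtures, fixture_name, room):
    """fixtures: list of (name, room, device id) after manual cleanup."""
    for name, fixture_room, device in fixtures:
        if name == fixture_name and fixture_room == room:
            return device
    raise LookupError(f"no {fixture_name!r} in {room!r}")
```

The manual cleanup step matters precisely because this lookup is only as good as the names on the map: fix the crane lamp's label once, and every later voice command resolves correctly.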

As we said, Tim, we’re simple folk and there are a number of ways you could set up this mapping thing. But we know you and Apple have come up with something pretty sweet. Right?

So those are the three simple things.

Thanks, Mr. Cook. We’re excited for Monday.

[Update 6-13-16]
Boy, were we off. 0.5 out of three. We sure hope you’re waiting for the new dual-lens phones for indoor mapping. Disappointing.