Introducing Chroma

At Epilog we use our iPhone cameras to take dozens of pictures a day. But these photos have a big problem: they can’t capture moments exactly as your eyes see them. Ever tried to zoom in on an iPhone photo or take a picture of a sunset? It’s not great.

Cameras today are the flip phone cameras of tomorrow: we’ll look back at them and wish we could’ve saved our memories a little better.

Chroma is different. It’s a new way to build cameras, with the singular goal of capturing a scene exactly how it looks in real life.

How does it work?

As you may know, a camera focuses light through a lens onto a film or digital image plane.

Because digital image sensors are typically tiny (about the size of your pinky fingernail), the image is squeezed into a small area and loses detail.

Chroma is better. Chroma uses multiple image planes, each capturing a different part of the image.

We use the same image sensors, but spread the image out onto a much larger area. It’s like catching fish with a single net versus sewing several nets together to catch far more.
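The idea of combining several sensor tiles into one larger image can be sketched in a few lines. This is purely illustrative, not Epilog’s actual pipeline: it assumes a hypothetical camera that returns one image array per sensor, arranged in a row-major grid, and simply stitches them together.

```python
import numpy as np

def stitch_tiles(tiles, grid_shape):
    """Combine per-sensor tiles (row-major order) into one large image.

    tiles: list of 2D arrays, one per sensor
    grid_shape: (rows, cols) layout of the sensor grid
    """
    rows, cols = grid_shape
    # Join each row of tiles side by side, then stack the rows vertically.
    row_images = [np.hstack(tiles[r * cols:(r + 1) * cols]) for r in range(rows)]
    return np.vstack(row_images)

# Four hypothetical 2x2-pixel tiles arranged in a 2x2 grid -> one 4x4 image
tiles = [np.full((2, 2), i) for i in range(4)]
image = stitch_tiles(tiles, (2, 2))
print(image.shape)  # (4, 4)
```

The point of the sketch: each small sensor only has to resolve its own patch of the scene, so the combined image holds far more detail than any single tile could.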

Seems obvious, right? Well, we patented it. 🤗

What’s next

We’re putting Chroma in self-driving cars. One of the biggest problems in the self-driving world is that cars can’t “see” their surroundings in enough detail. Tesla Autopilot uses cameras that are less than HD resolution. That’s like trusting your grandma to drive you without wearing her glasses.

Epilog has changed the game. We’re recognizing stop signs and stop lights, avoiding obstacles in the road, and increasing safety in heavy rain. It’s like a professional chauffeur is driving. We’re bringing Chroma technology to market as Sherpa in partnership with Jabil, the best optics manufacturer on the planet.

❤️ the Epilog team