This weekend we set out to build a color picker for real life. Grab your phone, point it at a thing, find that thing’s RGB color.
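At its core, a real-life color picker reduces a patch of camera pixels to a single RGB value. Here's a minimal sketch of that reduction step; the function name and the flat-RGBA input layout are my assumptions for illustration, not the format the camera actually hands you:

```javascript
// Hypothetical helper: average a flat RGBA byte array down to one RGB color.
// Assumes 4 bytes per pixel (R, G, B, A) — the camera's real output format
// may differ and need decoding first.
function averageColor(pixels) {
  let r = 0, g = 0, b = 0;
  const count = pixels.length / 4;
  for (let i = 0; i < pixels.length; i += 4) {
    r += pixels[i];
    g += pixels[i + 1];
    b += pixels[i + 2]; // skip pixels[i + 3], the alpha channel
  }
  return {
    r: Math.round(r / count),
    g: Math.round(g / count),
    b: Math.round(b / count),
  };
}

// Averaging one pure-red and one pure-blue pixel gives a middle purple:
averageColor([255, 0, 0, 255, 0, 0, 255, 255]); // → { r: 128, g: 0, b: 128 }
```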
Not sure why, but it’s a fun little way to play with React Native and device cameras. We found a package called react-native-camera that makes camera integration easy.
Import the package, and use it to render a live view of the camera in your app. That was easier than I expected. I guess that's the benefit of being late to the game: others have already built the hard parts. Now you just have to snap them together like LEGO pieces in novel ways.
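Rendering the live view boils down to one component. A minimal sketch, assuming react-native-camera's `RNCamera` component; the full-screen styling and the `captureAudio` flag are my choices, not from the post:

```javascript
import React from 'react';
import { StyleSheet } from 'react-native';
import { RNCamera } from 'react-native-camera';

// Fills the screen with a live feed from the back camera.
const CameraScreen = () => (
  <RNCamera
    style={StyleSheet.absoluteFill}
    type={RNCamera.Constants.Type.back}
    captureAudio={false} // we only need frames, not sound
  />
);

export default CameraScreen;
```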
That’s the main benefit of React Native, isn’t it? Components that do stuff and fit together. Just write a little glue code, and everything fits.
My favorite discovery was QuickTime's movie recording mode: plug in your phone, and QuickTime can mirror its screen on your Mac.
No more this:
Isn’t that better? I think it is. It’s going to improve my future livecoding sessions and React Native School videos.
Hm, this isn’t a very long livecoding recap. Could we really have done so little? Was I that wiped out from my long Sunday morning run?
And we discovered that even if you’re saving photos temporarily, you still need Photo Library permissions. Forget to ask, and your app will crash. That part was confusing.
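On iOS, that means declaring a usage description in `Info.plist` before touching the photo library. The key names below are Apple's standard privacy keys; the description strings are placeholders you'd write yourself:

```xml
<!-- In your app's Info.plist. Without a usage description, the app
     crashes the moment it touches the photo library. -->
<key>NSPhotoLibraryUsageDescription</key>
<string>Used to read the captured photo and sample its colors.</string>
<key>NSPhotoLibraryAddUsageDescription</key>
<string>Used to temporarily save captured photos.</string>
```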
But that’s basically it, darn. Next time I should drink more caffeine.
Here’s the next day’s follow-up session. That was fun too; we learned that my computer is painfully slow.