If you're building an Android camera application, it's quite useful to know, and often to display, exactly "what the camera is seeing".
There is some documentation on this subject, but getting it right can be quite tricky, especially getting it right on the numerous different types of devices out there.
First, there are a few things to understand about what's going on:
- The old Android Camera API was deprecated in API level 21 (Android 5.0) in favor of the Camera2 API
- The camera sensor's orientation and the device's display orientation are independent of each other, so they must be aligned explicitly for predictable preview behavior
- You can work around problem #1 by checking for Camera2 API availability at runtime, and then correct for problem #2, since the new API exposes the sensor's mounting angle via the `SENSOR_ORIENTATION` metadata tag.
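To make the alignment in the points above concrete, here is a minimal sketch of the usual rotation math. The formula mirrors the logic documented for the legacy `Camera.setDisplayOrientation()`, expressed as plain arithmetic so it applies equally when you read `SENSOR_ORIENTATION` from Camera2. The class and method names are illustrative, not from any official sample:

```java
/**
 * Computes the clockwise rotation (in degrees) to apply to the camera
 * preview so it appears upright on screen. Pure math, no Android
 * dependencies; feed it values from CameraCharacteristics and
 * Display.getRotation() in a real app.
 */
public class PreviewOrientation {

    /**
     * @param sensorOrientation the sensor's mounting angle in degrees
     *                          (commonly 90 or 270 on phones)
     * @param displayRotation   the current UI rotation in degrees
     *                          (0, 90, 180, or 270)
     * @param frontFacing       true for the front camera, whose preview
     *                          is mirrored
     */
    public static int compute(int sensorOrientation, int displayRotation,
                              boolean frontFacing) {
        if (frontFacing) {
            int result = (sensorOrientation + displayRotation) % 360;
            return (360 - result) % 360; // compensate for the mirror
        }
        return (sensorOrientation - displayRotation + 360) % 360;
    }

    public static void main(String[] args) {
        // Back camera mounted at 90°, phone held in portrait (rotation 0):
        System.out.println(compute(90, 0, false));  // 90
        // Same phone turned to landscape (rotation 90):
        System.out.println(compute(90, 90, false)); // 0
    }
}
```

The key point is that the sensor angle and the display rotation combine, so hard-coding a 90° rotation only "works" on devices whose sensor happens to be mounted that way, which is exactly why behavior varies across devices.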
Here's my fork of Google's MediaRecorder sample, which is compatible both with Nexus devices and with phones below API 23, with a separate branch that uses a TextureView instead of a SurfaceView.
There are many other helpful resources out there: