Extending App Functionality with React Native Camera

June 24, 2022

Tetiana Stoyko

CTO

React Native mobile app development is a complex process. Such an application can be a standalone product or an addition to existing brand channels, for example a website that provides the same features. Previously, we discussed which mobile format to choose: a mobile website or a mobile application.

Obviously, that is not the only decision to make. In fact, it is just the beginning of the development process, which involves many variations and possible choices. For instance, you have to decide which software and frameworks to use. You also need to create a list of app features that will satisfy your needs. Finally, you have to shape how the product will look and which UI elements it should use.

All of these processes are just a small part of the planning and development work. We could talk about the possibilities for hours, but instead of imagining them, it is better to consider an example.

Visualization is the best option here, because it helps avoid the misunderstandings and misconceptions that will probably occur if you try to imagine the options and possible results of a future project without a basic understanding of the IT sphere.

This is why we propose to consider an abstract case, in order to see what developers can and can't do with specific frameworks.

Tech Task Formation

Let's suppose we are creating an application for an online store. It must provide a virtual showcase. The customers stated that performance and user experience are very important to them.

Moreover, they require extended app functionality: the app must work as a virtual showcase for the customer, but shop assistants should be able to update the data. For instance, they need a way to take photos of the available goods and upload them, so that customers know what is currently in the store.

Unquestionably, we are dealing with a native application with extended functionality that allows taking photos within the app. And clearly, we are not going to walk through the entire development process from beginning to end. First of all, it would take too much time. Secondly, it would require a lot of code and explanation. This is why we propose to take a close look at one of the app features and explain it in detail.

In other words, let's slightly change the tech task: the application was already developed with the React Native framework, exclusively for iOS. The previous development team implemented almost all the required features, and "taking photos with the app" is the last one left. So how do we make it possible?

React Native Camera Component

There are at least two ways to enable taking photos in an app based on React Native:

  1. The hard one. In this case, the developer decides to code everything from scratch. It can take hundreds of lines of code. Of course, it will show how skilled the developer is, and such an approach is good for practicing coding skills, or it may be necessary in rare circumstances when the second option is simply impossible. In other words, it is highly recommended not to use this method without urgent need.

  2. The smart one. The React Native framework allows using various add-ons and libraries. This significantly simplifies development and lets you skip a lot of coding. In our case, there is a great library for React Native mobile apps: React Native Vision Camera, an open-source library that makes it easy to connect the camera component, avoiding "the hard way" mentioned before.

There are several React Native libraries that support camera functionality. We are going to use React Native Vision Camera. Although the older React Native Camera library also exists, we refer to the "Vision" one because the original RNC is deprecated in its favor.
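Before the import in the next section will resolve, the library has to be installed and the camera permission declared. A typical setup looks roughly like this (the package name is as published on npm; exact steps may differ depending on your project configuration):

```shell
# Install the library into the React Native project
npm install react-native-vision-camera

# iOS: link the native code via CocoaPods
cd ios && pod install && cd ..
```

On iOS you also need to add the `NSCameraUsageDescription` key to `ios/<YourApp>/Info.plist` with a short explanation of why the app uses the camera; otherwise the system will reject the camera request.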

The preparatory stage is done, let’s do some coding!

Technical Part

Preparation Stage

First of all, we need to add the library:

import { Camera, useCameraDevices } from "react-native-vision-camera";

Now we have imported the library components we can use to enable the "taking photos with an app" feature. In this specific case:

  • ‘Camera’ is the component that renders the camera preview and takes photos;
  • ‘useCameraDevices’ is our main hook. It gives access to the information about the available camera devices so that we can choose the one we need.

So, the next step is to choose the needed camera:

const devices = useCameraDevices("wide-angle-camera");

This line selects devices with the standard wide-angle camera type. Since we are developing a React Native mobile app, we should keep in mind that most modern mobile devices have at least two cameras: the front one (for selfies) and the back (regular) one. This is why we also need to state which one we want. The obvious choice is the back camera:

const device = devices.back;

Now that we have defined the camera type and the device, it is time to create a reference (‘ref’) to this camera. It will help us handle all the related processes later and identify the device we need:

const camera = useRef<Camera>(null);

The last important preparation step before the main coding is to render the camera component so that it shows a live camera preview inside our application:

<Camera
       ref={camera}
       style={StyleSheet.absoluteFill}
       device={device}
       isActive={true}
       photo={true}
/>
  • The camera now accepts the reference (the first prop, ‘ref’);
  • ‘style’ defines how the preview is displayed. In our case, ‘absoluteFill’ means the camera preview will take up all the space of its parent block;
  • ‘device’ identifies the camera to use;
  • ‘isActive’ controls whether the camera is live;
  • ‘photo’ states that the camera takes photos. A video feature could be enabled as well, but we don’t need it, so we skip it.
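Putting the preparation snippets together, a minimal screen component could look roughly like this. This is a sketch under our assumptions: the component name is ours, and permission handling (which the library also requires) is omitted for brevity:

```typescript
import React, { useRef } from "react";
import { StyleSheet } from "react-native";
import { Camera, useCameraDevices } from "react-native-vision-camera";

// Minimal sketch: render a full-screen live preview using the back camera.
// Real code should also request camera permission and handle errors.
export function ShowcaseCamera() {
  const devices = useCameraDevices("wide-angle-camera");
  const device = devices.back;
  const camera = useRef<Camera>(null);

  // Devices are discovered asynchronously, so the back camera
  // may not be available on the very first render.
  if (device == null) {
    return null;
  }

  return (
    <Camera
      ref={camera}
      style={StyleSheet.absoluteFill}
      device={device}
      isActive={true}
      photo={true}
    />
  );
}
```

The `device == null` guard matters because `useCameraDevices` returns its results after an asynchronous lookup, and rendering `<Camera>` without a device would fail.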

Camera Component Implementation

Finally, we can enable “taking photos with an app”:

if (camera?.current) {
  const photo = await camera.current.takePhoto({
    flash: "on",
    qualityPrioritization: "speed",
    skipMetadata: true,
  });
  if (photo?.path) {
    // On Android the returned path needs a "file://" prefix; iOS does not
    const imagePath =
      Platform.OS === "ios" ? photo.path : `file://${photo.path}`;
    // ...save or upload imagePath here
  }
}

Obviously, first of all, we have to check that the camera exists and is ready. If everything is fine, we call the ‘takePhoto’ function provided by the React Native Vision Camera library. Note that ‘takePhoto’ is asynchronous, so this code must run inside an async function.

Next, we set the parameters of the action:

  • ‘flash’ activates the flashlight when the photo is taken;
  • ‘qualityPrioritization’ makes the speed of taking photos the priority. We could prioritize photo quality instead, but in our circumstances that would slow down capture and produce heavier files to process;
  • ‘skipMetadata’ avoids saving the metadata of each photo, because we are not interested in it.

Finally, we work with the result (the photo). If everything worked as intended and we got a photo, we build the path to it so that we can save it into the database. The path depends on the platform (‘Platform.OS’), since Android and iOS differ here: on iOS the returned path can be used as-is, while on Android it needs a ‘file://’ prefix.
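The platform branch above can be isolated into a small pure helper, which makes it easy to unit-test without a device. The function name is ours, not part of the library; the platform is passed in as a parameter so the helper does not depend on React Native at test time:

```typescript
// Hypothetical helper: turn the path returned by takePhoto() into a URI
// usable as an <Image> source. Android returns a bare filesystem path that
// needs a "file://" prefix; iOS returns a path that works as-is.
function toImageUri(path: string, os: "ios" | "android"): string {
  return os === "ios" ? path : `file://${path}`;
}
```

In the app code it would be called as `toImageUri(photo.path, Platform.OS as "ios" | "android")`.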

End Line

Clearly, the foregoing code is just a limited sample. If you try to combine all the mentioned pieces of code as-is, you will get no result. To make this sample work, you will need to fill in some additional parts.

Nevertheless, the fact that you cannot use the sample as prewritten code for extending a React Native mobile app's functionality doesn't mean it is wrong. The purpose of this text is to briefly explain how to enable the camera component in apps based on the React Native framework.

In the end, we can summarize that there are various ways to extend app functionality with UI elements like a camera display. On the one hand, you can write the code fully manually, avoiding libraries and add-ons.

On the other hand, if you prefer comfort over the old-school approach, you may like using libraries. Clearly, they are not the answer to every question, and you still have to code and understand what you are doing. However, a good library is a great helping tool that can make your life easier.

Moreover, there are plenty of libraries to choose from, so you can find the one that fits best. They can exist as full-fledged alternatives to each other or as deep improvements. In our case, React Native Vision Camera is practically a successor of the older React Native Camera library, which is even stated on the latter's GitHub page.

If you are looking for an experienced development team that can develop your project from scratch or simply update an existing one, contact us. Our Incora developers are experienced professionals who work with Frontend, Backend, Database technologies, Cloud services, and DevOps tools.
