Written by Alex Gibson, on April 14, 2017


React Native is becoming widely popular. The premise that you can write in one language and deploy to both iOS and Android with a lot of code reuse is very appealing. I decided I would test React Native in an existing iOS application so that I could stay in my natural habitat but still get the benefits of React Native. There are some great tutorials, but like many things they are not perfect, and I found myself banging my head against a wall. I am going to try to plug some of the holes and also demonstrate why React Native is so powerful and wonderful by creating a much needed reusable component: a custom camera that looks better than a standard UIImagePickerController.

I am assuming you have at least played with React Native and have installed the needed SDKs. You will also need CocoaPods installed, as this is the recommended approach to install React Native in your iOS app. I would also recommend Visual Studio Code by Microsoft to edit your files on the React Native side of the code. VS Code has some great plugins to help with code completion and snippets, so look through the plugins you can install as well. The tutorials provided by Facebook are great and worth walking through. You should have completed the basic setup tutorial and have the app running with React Native and iOS.

We are going to build a camera that can be used with native iOS and shown in any view. This means you can have a cool collection view cell with a camera, or a full screen camera. Whatever you want. And this camera will be added with React Native, so you can skip building one for yourself in AVFoundation. Thankfully we don’t have to reinvent the wheel, as a camera that works with iOS and Android in React Native has already been built. Yes, I said iOS and Android, so companies with apps on both platforms take note and try to build as much cross-platform code as you can. We are going to take the camera and style it to our needs. Most of the work we will be doing is setup, with very little code. I post the code several times so that you can see what changed. Let’s get started.
So first and foremost, we will need to run on a real device, as simulators do not have cameras, and we also need to add the camera usage information to the Info.plist. Open the Info.plist as source code and paste in the items below.


<key>NSCameraUsageDescription</key>
<string>${PRODUCT_NAME} Camera Usage</string>
<key>NSLocationWhenInUseUsageDescription</key>
<string></string>
<key>NSPhotoLibraryUsageDescription</key>
<string>${PRODUCT_NAME} PhotoLibrary Usage</string>

 

In the previous tutorial we added React and React Native. Now we need to add the open source camera from GitHub. In the package.json file in the js folder, add react-native-camera to the dependencies.


"react-native-camera": "~0.6"

Make sure to run npm install. Now go to the Podfile in the ios folder and add the camera to the pods.


pod 'react-native-camera', path: '../js/node_modules/react-native-camera'

We are ready to start building a styled camera.  Create a file named StyledCamera.js inside the js folder.  We need to do the usual imports and in the render we need to return the Camera.

 


'use strict';
 
import React, { Component } from 'react';
 
import {
  AppRegistry,
  Dimensions,
  StyleSheet,
  View,
} from 'react-native';
 
import Camera from 'react-native-camera';
 
export default class StyledCamera extends Component {
 
  render() {
    return (
      <View style={styles.container}>
        <Camera
          ref={(cam) => {
            this.camera = cam;
          }}
          style={styles.preview}
          aspect={Camera.constants.Aspect.fill}
          onZoomChanged={(event) => {
            var { velocity, zoomFactor } = event.nativeEvent;
            console.log('we have a zoom');
          }}
          onFocusChanged={(event) => {
            var { x, y } = event.nativeEvent;
            console.log('we have a touch');
          }}>
        </Camera>
      </View>
    );
  }
}
 
const styles = StyleSheet.create({
 
  container: {
    flex: 1
  },
  preview: {
    flex: 1,
    justifyContent: 'flex-end',
    alignItems: 'center',
  }
});

 

So let’s go over this code for our Camera.  First the usual imports but this time we import the Camera that we installed with the npm package at the very beginning. Then in the render we wrap the Camera with a view and show it.  The Camera has properties that we can watch and change.  You can see I am logging zoom change and focus change but we could call a function as well.  I also set a ref on the Camera so that we can call actions on the Camera later. Much like self.camera we will be able to get the camera with the ref.
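If callback refs are new to you, here is the pattern outside React, with `Holder` and `fakeCamera` as made-up stand-ins purely for illustration: React calls the ref function with the mounted component instance, and we stash it on `this`, much like assigning to a property in Swift.

```javascript
// Hypothetical stand-ins to illustrate the callback-ref pattern;
// React itself is what calls the ref function when the component mounts.
class Holder {
  setRef(cam) {
    // stash the instance so other methods can call actions on it later
    this.camera = cam;
  }
}

const holder = new Holder();
const fakeCamera = { capture: () => 'captured' };
holder.setRef(fakeCamera); // what React does with ref={(cam) => { this.camera = cam; }}
console.log(holder.camera.capture()); // 'captured'
```

This is why takePicture() later can simply call this.camera.capture().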

 

Now return to the index.ios.js file and remove the Text and View import from the react native imports and add an import that brings in our StyledCamera and return that in the render function.

 


'use strict';
 
import {
 
  AppRegistry,
  StyleSheet,
} from 'react-native';
 
import React, { Component } from 'react';
import StyledCamera from './StyledCamera.js';
 
export default class RNView extends Component {
 
    render() {
        return(
            <StyledCamera/>
        )
    }
}
 
AppRegistry.registerComponent('RNView', () => RNView);

Make sure to hit Command S on both files, StyledCamera.js and index.ios.js, to save them. All we did here was remove the test view that we had with text, and now we load our camera from the StyledCamera file.

Now plug a real iOS device up to your computer and run the app.

Failure! Failure! Failure!  Now this puzzled me for quite a while, and I think I was ready to break a computer before I finally figured it out. Your phone and your computer are trying to connect with each other; in my case they were hooked to a wireless router, so over Wi-Fi. **If for some reason yours loaded, skip this step.** Open Terminal and type


ipconfig getifaddr en0

This will return your computer’s address on the network.  Now navigate to your ViewController file and replace localhost with that address.  My viewDidLoad now looks like this.

 


override func viewDidLoad() {
    super.viewDidLoad()
    rnCamera = RCTRootView(bundleURL: URL(string: "http://192.168.1.8:8081/index.ios.bundle?platform=ios"),
                           moduleName: "RNView",
                           initialProperties: nil,
                           launchOptions: nil)
    rnCamera.frame = self.view.bounds
    rnCamera.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    self.view.addSubview(rnCamera)
}

Now rerun, and you should get prompted by the system for permissions and then see the camera.

And finally, here we are: a completely blank and beautiful camera to be styled quickly any way we want.

So shake your device, enable Hot Reloading, and let’s get to work. We definitely need a camera button. I want it to be circular, so at the top of the StyledCamera file, directly under the imports, I am going to set the width/height of our button.


//this will be the width and height of our button for the camera
const buttonWidth = 80;

So in between the Camera open and close tags I am going to add a view with a white border. Your render should now look like this.


render() {
 
    return (
      <View style={styles.container}>
        <Camera
          ref={(cam) => {
            this.camera = cam;
          }}
          style={styles.preview}
          aspect={Camera.constants.Aspect.fill}
          onZoomChanged={(event) => {
            var { velocity, zoomFactor } = event.nativeEvent;
            console.log('we have a zoom');
          }}
          onFocusChanged={(event) => {
            var { x, y } = event.nativeEvent;
            console.log('we have a touch');
          }}>

          <View style={{
            width: buttonWidth,
            height: buttonWidth,
            borderRadius: buttonWidth / 2,
            backgroundColor: 'transparent',
            alignItems: 'center',
            justifyContent: 'center',
            borderColor: 'white',
            borderWidth: 4,
            marginBottom: 20
          }}>

          </View>
        </Camera>
      </View>
    );
  }

Hit command S and you should now see the circle on your device.

So let’s quickly see what we did. We added a view and set its style to a transparent background with a white border. Now I am going to add another view, a solid white circle inside this one, and I want it to be touchable. So at the top, in the imports, add TouchableWithoutFeedback. Your React Native import should now look like this.


import {
  AppRegistry,
  Dimensions,
  StyleSheet,
  TouchableWithoutFeedback,
  View,
} from 'react-native';

Why TouchableWithoutFeedback? We will get to that later, as I have plans to jazz this up. Let’s also add a method that will take a picture; add it just above the render function. Basically this is almost a copy from the GitHub page for the Camera, just to keep things simple. After that, nest the touchable inner circle (the second snippet below) inside the border view we created.

 


takePicture() {
      //this will make the camera take a picture and save it to the photo library
      // or it will give us an error
      this.camera.capture()
        .then((imageInfo) => console.log(imageInfo))
        .catch(err => console.error(err));
    }

 


<TouchableWithoutFeedback
  onPress={this.takePicture.bind(this)}>
  <View style={{borderRadius: 1000, backgroundColor: 'white', padding: 20}}/>
</TouchableWithoutFeedback>

Your render should now look like this.


render() {
    return (
      <View style={styles.container}>
        <Camera
          ref={(cam) => {
            this.camera = cam;
          }}
          style={styles.preview}
          aspect={Camera.constants.Aspect.fill}
          onZoomChanged={(event) => {
            var { velocity, zoomFactor } = event.nativeEvent;
            console.log('we have a zoom');
          }}
          onFocusChanged={(event) => {
            var { x, y } = event.nativeEvent;
            console.log('we have a touch');
          }}>

          <View style={{
            width: buttonWidth,
            height: buttonWidth,
            borderRadius: buttonWidth / 2,
            backgroundColor: 'transparent',
            alignItems: 'center',
            justifyContent: 'center',
            borderColor: 'white',
            borderWidth: 4,
            marginBottom: 20
          }}>
              <TouchableWithoutFeedback
                          onPress={this.takePicture.bind(this)}>
                         <View style={{borderRadius:1000,backgroundColor:'white',padding:20}}/>
              </TouchableWithoutFeedback>
          </View>
        </Camera>
      </View>
    );
  }

Now hit Command S and you should see a solid center circle show up.  Tap the inner circle to take a picture and then navigate to your photo album to make sure it worked.  You should now have an image taken from your camera.  Also a note, sometimes going out of the app will cause the development server to stop.  Just rerun the app from Xcode.

Those who know me know that I like animations, and our button needs some feedback. We could highlight the inner circle to show the user that a picture is being taken, but let’s add an animation before we jump back to our iOS project code. Add the Animated and Easing imports from React Native. Your import should now look like this.


import {
  AppRegistry,
  Dimensions,
  StyleSheet,
  View,
  TouchableWithoutFeedback,
  Animated,
  Easing,
} from 'react-native';

Add a constructor at the top of the class and set a scale value of 1 that we will animate later.


constructor(props) {
    super(props);
    this.state = { bounceValue: new Animated.Value(1) };
}

Here I am just creating a bounce value that will be stored in state and setting it to 1.  We will use this to animate our small solid circle view.

Now in the render change our small button from a View to an Animated.View and add a transform style to the button. Your render will now look like this. On first load it will use the value of 1 from the constructor.


render() {
    return (
      <View style={styles.container}>
        <Camera
          ref={(cam) => {
            this.camera = cam;
          }}
          style={styles.preview}
          aspect={Camera.constants.Aspect.fill}
          onZoomChanged={(event) => {
            var { velocity, zoomFactor } = event.nativeEvent;
            console.log('we have a zoom');
          }}
          onFocusChanged={(event) => {
            var { x, y } = event.nativeEvent;
            console.log('we have a touch');
          }}>

          <View style={{
            width: buttonWidth,
            height: buttonWidth,
            borderRadius: buttonWidth / 2,
            backgroundColor: 'transparent',
            alignItems: 'center',
            justifyContent: 'center',
            borderColor: 'white',
            borderWidth: 4,
            marginBottom: 20
          }}>
              <TouchableWithoutFeedback
                          onPress={this.takePicture.bind(this)}>
                         <Animated.View style={{
                           borderRadius: 1000,
                           backgroundColor: 'white',
                           padding: 20,
                           // `transform` is an ordered array; map `bounceValue` to `scale`
                           transform: [{ scale: this.state.bounceValue }]
                         }}/>
              </TouchableWithoutFeedback>
          </View>
        </Camera>
      </View>
    );
  }

At the moment no animation will occur, but we will trigger it in takePicture() so that the user gets feedback. Add a timing animation to takePicture() and, just below it, a bounceBack() spring function that springs back to a scale of 1. They will look like this.

 


takePicture() {
      //this will make the camera take a picture and save it to the photo library
      // or it will give us an error
      this.camera.capture()
        .then((imageInfo) => console.log(imageInfo))
        .catch(err => console.error(err));

        Animated.timing(
            this.state.bounceValue,
            {
              toValue: 0.8,
              duration: 225,
              easing: Easing.in(Easing.ease)
            }
          ).start(() =>  this.bounceBack()
        )
    }

  bounceBack() {
      Animated.spring(
        this.state.bounceValue,
        {
          toValue: 1,
          friction: 10,
        }
      ).start();
  }

The way these two animations work is different. One runs on a fixed timer of 225 milliseconds, while the other uses a spring. While they run, the render function is called repeatedly, and that is how our view scales.
Hit Command S to save the file, then press the button. It should look like this.

(Animation of the capture button press.)
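To make that difference concrete, here is a toy model in plain JavaScript. This is not Animated’s internals, just illustrative math under my own assumptions (linear easing, an arbitrary spring stiffness): the timing driver interpolates over a fixed duration, while the spring integrates toward the target and friction settles it.

```javascript
// Toy timing driver: interpolate linearly over a fixed duration (ms).
function timingAt(t, duration, from, to) {
  const p = Math.min(t / duration, 1); // clamp progress to [0, 1]
  return from + (to - from) * p;
}

// Toy spring: crude semi-implicit Euler integration of a damped spring.
// The stiffness (100) and step size (0.001 s) are arbitrary illustration values.
function springSettle(from, to, friction) {
  let x = from;
  let v = 0;
  for (let i = 0; i < 10000; i++) {
    const a = (to - x) * 100 - v * friction; // spring force minus damping
    v += a * 0.001;
    x += v * 0.001;
  }
  return x;
}

console.log(timingAt(225, 225, 1, 0.8));          // exactly at the target when time is up
console.log(springSettle(0.8, 1, 10).toFixed(2)); // settles back near 1
```

A timing animation is done exactly when its duration elapses; a spring has no fixed end time and overshoots less the higher the friction, which is why we use friction: 10 for a gentle bounce back.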

We are almost finished with the React Native side. Let’s pass the image back to our iOS app. How do we do that? Luckily we can do it pretty easily. Looking over the docs of the React Native Camera on GitHub, it appears that one of the image info options in the promise is a file path for the image. With that knowledge, it is time to create the code in our native iOS application to be called from the JavaScript world. First go to the Xcode menu, File -> New -> File, and choose Cocoa Touch Class.

Make it a Swift file and a subclass of NSObject. Let’s name it RNCameraManager. Now let’s write a method to accept the file path that we get from our React Native camera image info. Because of how the methods are bridged, we need to expose the class to Objective-C. Our file will look like this.

 


import UIKit
import React

@objc(RNCameraManager)
class RNCameraManager: NSObject {
    var bridge: RCTBridge!

    @objc func cameraDidTakePicture(_ imageInfo: NSDictionary, reactTag: NSNumber) {
    }
}

Because we still need to bridge in Objective-C, we can create an Objective-C file that will work with our Swift file. Go to File -> New -> File and choose Objective-C File (the one with the .m icon). Name it the same as your Swift file, RNCameraManager. Say no to creating the bridging header (if you created one anyway, no worries; we will not need it). Add the following code to the new Objective-C file.

 


#import <Foundation/Foundation.h>
#import <React/RCTBridgeModule.h>
@interface RCT_EXTERN_MODULE(RNCameraManager, NSObject)
RCT_EXTERN_METHOD(cameraDidTakePicture:(nonnull NSDictionary *)imageInfo reactTag:(nonnull NSNumber *)reactTag)
@end

Basically this code is what actually gets called from JavaScript and forwarded on to your Swift file. We are finished with the Objective-C, so back to the Swift file for some finishing touches.

We need to get the path out of the dictionary of information and use the Photos framework to retrieve the photo. So import Photos just under the import for React. Now, inside the function, let’s unwrap the dictionary and file path and retrieve the asset while also getting onto the main thread. The code we are adding will look like this.

 


if let imagePath = imageInfo.value(forKey: "path") as? String {
    DispatchQueue.main.async {
        // target iOS 8+ with the Photos framework
        let options = PHFetchOptions()
        options.fetchLimit = 1
        let assets = PHAsset.fetchAssets(withALAssetURLs: [URL(string: imagePath)!],
                                         options: options)
        if let view = self.bridge.uiManager.view(forReactTag: reactTag) {
            if let presentedViewController = view.reactViewController() as? ViewController {
                // to do code
            }
        }
    }
}

This code is pretty straightforward. Knowing that the path attribute of the dictionary gives us the image file path, we use the Photos framework to fetch the asset that matches the URL. In this case we are only passing our new path URL from the React Native side to the Swift file and finding the corresponding Photos asset. Then we call the bridge’s UI manager to get the view controller that is hosting our React Native camera, and from there we can trigger segues or any other action we want on that view controller.

Navigate to the storyboard and drag a new view controller onto it. Now create a segue from our current view controller to the new view controller. Choose Present Modally.

Give the segue an identifier of toImageDetail.

Add a UIImageView to the new controller and pin it 0 on all sides.  Then choose update frames.

 

Add a UIButton to the top right and pin it 20 top and 20 right. Change the text to Close. Update your frames by choosing the yellow triangle.

 

We need to make a view controller file. Go to the top menu, choose File -> New -> File, pick Cocoa Touch Class, and make it a subclass of UIViewController. Not being very creative at the moment, we will name it ImageDetailViewController. We want it to show an image that we pass in and dismiss when the close button is pressed. Here is what the controller will look like.

 


import UIKit

class ImageDetailViewController: UIViewController {

    @IBOutlet weak var imgView: UIImageView!
    var image: UIImage?

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        imgView.image = image
    }

    @IBAction func closeDidPress(_ sender: Any) {
        self.dismiss(animated: true, completion: nil)
    }
}

Now back to our main view controller. First, let’s add an identifier to our React Native view in the passed props. We are going to add the method that RNCameraManager will call to pass the image in and perform the segue. Notice we identify the view with an Int (identifier: 2) in the initial properties in viewDidLoad. We are also changing the module name from RNView to RNCamera; I clean up the index.ios.js file further down in the article, but rename it now in the ViewController.



import UIKit
import React

class ViewController: UIViewController {
    var rnCamera: RCTRootView!

    override func viewDidLoad() {
        super.viewDidLoad()
        rnCamera = RCTRootView(bundleURL: URL(string: "http://192.168.1.27:8081/index.ios.bundle?platform=ios"),
                               moduleName: "RNCamera",
                               initialProperties: ["identifier": 2],
                               launchOptions: nil)
        rnCamera.frame = self.view.bounds
        rnCamera.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        self.view.addSubview(rnCamera)
    }

    // will be called from RNCameraManager
    func segueWith(image: UIImage) {
        self.performSegue(withIdentifier: "toImageDetail", sender: image)
    }

    override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
        if segue.identifier == "toImageDetail" {
            guard let dvc = segue.destination as? ImageDetailViewController else { return }
            dvc.image = sender as? UIImage
            print("segue")
        }
    }
}

Back to our RNCameraManager to finish up. Add a function, getAssetThumbnail(), to retrieve the image from the PHAsset, and call the function to segue with our image. Your file should now look like this and be complete. We will be passing the URL and the tag back from React Native.


import UIKit
import React
import Photos

@objc(RNCameraManager)
class RNCameraManager: NSObject {

    var bridge: RCTBridge!

    @objc func cameraDidTakePicture(_ imageInfo: NSDictionary, reactTag: NSNumber) {
        // the path key is the file path provided by the React Native Camera image capture
        if let imagePath = imageInfo.value(forKey: "path") as? String {
            DispatchQueue.main.async {
                // target iOS 8+ with the Photos framework
                let options = PHFetchOptions()
                options.fetchLimit = 1
                let assets = PHAsset.fetchAssets(withALAssetURLs: [URL(string: imagePath)!], options: options)
                if let view = self.bridge.uiManager.view(forReactTag: reactTag) {
                    if let presentedViewController = view.reactViewController() as? ViewController {
                        presentedViewController.segueWith(image: self.getAssetThumbnail(asset: assets.firstObject!,
                                                                                        targetSize: presentedViewController.view.bounds.size))
                    }
                }
            }
        }
    }

    func getAssetThumbnail(asset: PHAsset, targetSize: CGSize) -> UIImage {
        let manager = PHImageManager.default()
        let option = PHImageRequestOptions()
        var thumbnail = UIImage()
        // very important, or the image would return nil before completion
        option.isSynchronous = true
        manager.requestImage(for: asset, targetSize: targetSize, contentMode: .aspectFit, options: option,
                             resultHandler: { (result, info) -> Void in
            thumbnail = result!
        })
        return thumbnail
    }
}

In proper Swift style we should not force unwrap, but you can handle those fixes on your own. Now back to our JavaScript. We are going to make a couple of final changes and then we will be finished.

Open index.ios.js. Since all we are doing is returning our component, we can shorten the code to just this.


'use strict';
 
import {
  AppRegistry,
} from 'react-native';
import React, { Component } from 'react';
import StyledCamera from './StyledCamera.js';
 
AppRegistry.registerComponent('RNCamera', () => StyledCamera);

 

Open StyledCamera.js. We need to add the identifier we are passing in from the controller (identifier: 2) to the state. The constructor will now look like this.


constructor(props){
    super(props);
    console.log("The identifier is",props.identifier)
    this.state = {
      bounceValue: new Animated.Value(1),
      identifier : props.identifier,
    };
  }
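The constructor above seeds state from the props we pass in natively. If React state feels foreign, this sketch shows the mechanics outside React (FakeComponent is a made-up stand-in, not React’s real implementation, which does far more):

```javascript
// FakeComponent is a hypothetical stand-in, not React's actual implementation.
class FakeComponent {
  constructor(props) {
    // seed state from the props passed in (initialProperties on the iOS side)
    this.state = { identifier: props.identifier, bounceValue: 1 };
  }
  setState(partial) {
    // React merges partial updates into the existing state object
    this.state = Object.assign({}, this.state, partial);
  }
}

const c = new FakeComponent({ identifier: 2 });
c.setState({ bounceValue: 0.8 });
console.log(c.state.identifier, c.state.bounceValue); // 2 0.8
```

Note that setState merges the partial update, which is why identifier survives when only bounceValue changes.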

Think of state as one magical dictionary if you are coming from iOS. We need to fix the imports, and we are going to declare our RNCameraManager from the iOS world. We also need to add NativeModules to our React Native imports. This will now look like the code below.


import {
  AppRegistry,
  Dimensions,
  StyleSheet,
  View,
  TouchableWithoutFeedback,
  Animated,
  Easing,
  NativeModules,
} from 'react-native';

const { RNCameraManager } = NativeModules;

We can call our function from the RNCameraManager in our native application now and pass the camera image info back.  takePicture() will now look like this.


takePicture() {
      // this will make the camera take a picture and save it to the photo library
      // or it will give us an error
       Animated.timing(
            this.state.bounceValue,
            {
              toValue: 0.8,
              duration: 225,
              easing: Easing.in(Easing.ease)
            }
          ).start(() =>  this.bounceBack()
        )
      this.camera.capture()
        .then((imageInfo) =>  {RNCameraManager.cameraDidTakePicture(imageInfo,this.state.identifier);})
        .catch(err => console.error(err));
    }
Be sure to hit Command S to save, then go to Xcode and rerun your project. Whenever we update things in the native application we have to rerun and cannot use hot reloading, but if you only update the JS code you can see results immediately.
Now you should see your camera, and if you snap a photo you should segue to the image detail. What is awesome about this is that you can style the camera easily with flexbox, and I challenge you to do so. You can also put this view in collection view cells and much more. I challenge you to add props that show or hide buttons the same way we passed the identifier, and to add a method so that touching the entire camera view takes a picture. Test yourself.
I am attaching the completed project on the Apptillery GitHub.  I will cover adding the js file to the main bundle in a future tutorial.