Making a camera app using Uno

I’m trying to make a camera app using Uno, but I’m getting these errors:

  code.uno(72,37): E3114: There is no identifier named 'UIApplicationDelegate' accessible in this scope. 
  code.uno(74,17): E3114: There is no identifier named 'UIImageView' accessible in this scope. 
  code.uno(75,3): E3114: There is no identifier named 'UIViewController' accessible in this scope. 
  code.uno(76,3): E3114: There is no identifier named 'AVCaptureSession' accessible in this scope. 
  code.uno(77,3): E3114: There is no identifier named 'OutputRecorder' accessible in this scope. 
  code.uno(78,3): E3114: There is no identifier named 'DispatchQueue' accessible in this scope. 
  code.uno(80,43): E3114: There is no identifier named 'UIApplication' accessible in this scope. 
  code.uno(80,62): E3114: There is no identifier named 'NSDictionary' accessible in this scope. 

using Uno;
using Uno.Collections;
using Fuse;
using Fuse.Scripting;
using Fuse.Reactive;

using Android.android.media;
using Android.android.app;
using iOS.AudioToolbox;

using iOS.CoreGraphics;
using iOS.Foundation;
using iOS.UIKit;
using iOS.AVFoundation;
using iOS.CoreVideo;
using iOS.CoreMedia;
using iOS.CoreFoundation;

public partial class AppDelegate : UIApplicationDelegate
{
    public static UIImageView ImageView;
    UIViewController vc;
    AVCaptureSession session;
    OutputRecorder outputRecorder;
    DispatchQueue queue;

    public override bool FinishedLaunching (UIApplication app, NSDictionary options)
    {
        ImageView = new UIImageView (new CGRect (20, 20, 280, 280));
        ImageView.ContentMode = UIViewContentMode.ScaleAspectFill;

        vc = new UIViewController ();
        vc.View = ImageView;
        window.RootViewController = vc;

        window.MakeKeyAndVisible ();
        window.BackgroundColor = UIColor.Black;

        if (!SetupCaptureSession ())
            window.AddSubview (new UILabel (new CGRect (20, 20, 200, 60)) { Text = "No input device" });

        return true;
    }


}

https://developer.apple.com/library/ios/qa/qa1702/_index.html

To be honest, I’m not sure where to start…

Hi Kevin!

To use iOS-specific classes, make sure that you are building for iOS and that you reference the Experimental.iOS package.
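
Assuming a standard .unoproj, the reference goes in the "Packages" list; the other entries below are just the usual project defaults and may look different in your project:

    {
      "Packages": [
        "Fuse",
        "FuseJS",
        "Experimental.iOS"
      ],
      "Includes": [ "*" ]
    }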

Fuse handles the application lifecycle for you, so there’s no need to create your own UIApplicationDelegate. It looks like you just want to run some code at startup, which can be done in your App subclass’s constructor, as shown here: https://www.fusetools.com/learn/guides/uno.
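
For example, here’s a minimal sketch of startup code in an App constructor (the class name CameraApp and the log message are made up for illustration):

    using Uno;
    using Fuse;

    public class CameraApp : App
    {
        public CameraApp()
        {
            // Startup code goes here instead of in FinishedLaunching;
            // Fuse creates and manages the underlying application delegate.
            debug_log "App started";
        }
    }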

It might help to browse the Experimental.iOS package a bit before using it; some of the assumptions you’re making about the classes don’t quite match what’s there. (For example, the UIApplicationDelegate protocol is called IUIApplicationDelegate in our bindings.)
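
For instance, a view controller adopting one of those protocols would be declared along these lines (a sketch only; verify the exact names by browsing the package, as they may differ between Fuse versions):

    // Objective-C protocols carry an 'I' prefix in the Experimental.iOS
    // bindings, e.g. IUIApplicationDelegate or IUIImagePickerControllerDelegate.
    extern(iOS) class MyController : iOS.UIKit.UIViewController, iOS.UIKit.IUIImagePickerControllerDelegate
    {
        // Implement only the delegate methods you need.
    }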

Hope that helps! 🙂

Okay, so here is where I got to:

  extern (iOS) class camview3: UIViewController, IUIImagePickerControllerDelegate, IUINavigationControllerDelegate{

  captureSession : iOS.AVFoundation.AVCaptureSession();
  stillImageOutput : iOS.AVFoundation.AVCaptureStillImageOutput();
  previewLayer : iOS.AVFoundation.AVCaptureVideoPreviewLayer();


  cameraView : UIView();

  override Function viewDidLoad(){
      super.viewDidLoad();
  } 


  override Function didReceiveMemoryWarning(){
      super.didReceiveMemoryWarning();
  } 

  override Function viewDidAppear(){
      super.viewDidAppear(animated);

      previewLayer.frame = cameraView.bounds();

  }

  override Function viewWillAppear(){
      super.viewWillAppear();

      captureSession = AVCaptureSession();
      captureSession.sessionPreset = AVCaptureSessionPreset1920x1080();


      var backCamera = iOS.AVFoundation.AVCaptureDevice._defaultDeviceWithMediaType(AVMediaTypeVideo);

      var error = NSError();
      var input = iOS.AVFoundation.AVCaptureDeviceInput("device: backCamera, error: &error");

      captureSession.addInput(input);

      stillImageOutput = iOS.AVFoundation.AVCaptureStillImageOutput();
      stillImageOutput.outputSetting = (Fuse.Scripting.Array)context.Evaluate("AVVideoCodecKey", "AVVideoCodecJPEG");
      //(Fuse.Scripting.Array)context.Evaluate("AVVideoCodecKey", "AVVideoCodecJPEG");

      previewLayer = iOS.AVFoundation.AVCaptureVideoPreviewLayer(session: captureSession);
      previewLayer.videoGravaty = AVLayerVideoGravityResizeAspect();
      previewLayer.connection.videoOrientation = AVCaptureVideoOrientation.Portrait();
      cameraView.layer.adsSublayer(previewLayer);

      captureSession.startRunning();

  } 


  }

but I’m getting these errors:

  E3203: Unable to resolve data type of meta property, no unambiguous definition of 'captureSession' was found in the block scope
  E3203: Unable to resolve data type of meta property, no unambiguous definition of 'stillImageOutput' was found in the block scope
  E3203: Unable to resolve data type of meta property, no unambiguous definition of 'previewLayer' was found in the block scope
  E3203: Unable to resolve data type of meta property, no unambiguous definition of 'cameraView' was found in the block scope

Hi Kevin,

Although what you are trying to do is possible, I think a better way to do it is something along these lines: https://www.fusetools.com/community/forums/howto_discussions/camera_panel_for_ios

As for the E3203 errors: declarations like 'captureSession : iOS.AVFoundation.AVCaptureSession();' use Uno’s meta-property syntax, which is only valid inside a block, so the compiler can’t resolve them inside a class. In a class, declare ordinary C#-style fields and methods instead ('override Function viewDidLoad()' isn’t valid Uno either; methods are written C#-style, e.g. 'override void ...').
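
For example, a minimal sketch of the same class skeleton with ordinary fields (the binding names are taken from your code above; verify them against the Experimental.iOS package):

    // C#-style fields instead of 'name : Type()' meta-property declarations,
    // which is what the E3203 errors are complaining about.
    extern(iOS) class CamView3 : iOS.UIKit.UIViewController
    {
        iOS.AVFoundation.AVCaptureSession _captureSession;
        iOS.AVFoundation.AVCaptureStillImageOutput _stillImageOutput;
        iOS.AVFoundation.AVCaptureVideoPreviewLayer _previewLayer;
        iOS.UIKit.UIView _cameraView;
    }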