Xamarin.Forms Projects – Second Edition

The second edition of Xamarin.Forms Projects is now published. You can buy it from Packt or Amazon. It will also be available in other bookstores.

More Information

An example-driven guide to help you build native cross-platform mobile apps using Xamarin, .NET Core 3, and Visual Studio 2019.

Set up Xamarin.Forms for building native apps with code-sharing capabilities

Understand the core aspects of developing a mobile app such as its layout, UX, and rendering

Use custom renderers to gain platform-specific access

Discover how to create custom layouts for your apps with Xamarin.Forms Shell

Use Azure SignalR for implementing serverless services in your Xamarin apps

Create an augmented reality (AR) game for Android and iOS using ARCore and ARKit, respectively

Build and train machine learning models using CoreML, TensorFlow, and Azure Cognitive Services

Xamarin.Forms is a lightweight cross-platform development toolkit for building apps with a rich user interface. Improved and updated to cover the latest features of Xamarin.Forms, this second edition covers CollectionView and Shell, along with interesting concepts such as augmented reality and machine learning.

Starting with an introduction to Xamarin and how it works, this book shares tips on choosing the right development environment when planning cross-platform mobile apps. You’ll build your first Xamarin.Forms app and learn how to use Shell to implement the app architecture. The book gradually increases the level of complexity of the projects, guiding you through creating apps ranging from a location tracker and weather map to an AR game and face recognition. As you advance, the book will take you through modern mobile development frameworks such as SQLite, .NET Core, Mono, ARKit, and ARCore. You’ll be able to customize your apps for both Android and iOS platforms to achieve native-like performance and speed. The book is filled with engaging examples, so you can grasp essential concepts by writing code instead of reading through endless theory.

By the end of this book, you’ll be ready to develop your own native apps with Xamarin.Forms and its associated technologies such as .NET Core, Visual Studio 2019, and C#.

Develop mobile apps, AR games, and chatbots of varying complexity with the help of real-world examples

Explore the important features of Xamarin.Forms 4 such as Shell, CollectionView, and CarouselView

Get to grips with advanced concepts such as AR, VR, and machine learning for mobile development

Introducing TinySvgHelper

For quite a long time I have had code, used in a couple of different apps, that uses SkiaSharp to convert an svg image to a Xamarin.Forms ImageSource. I also blogged about it two years ago, https://danielhindrikes.se/index.php/2018/03/13/use-skiasharp-to-convert-svg-to-xamarin-forms-imagesource/. But I never released it as a library, until now.

Get started with TinySvgHelper

The library is published to NuGet, https://www.nuget.org/packages/TinySvgHelper/

To install it, search for TinySvgHelper in the **NuGet Package Manager** or install it with the following command:

Install-Package TinySvgHelper

To use it, you must add a call to SvgHelper.Init() to your MainActivity (Android) and/or AppDelegate (iOS).

MainActivity

protected override void OnCreate(Bundle savedInstanceState)
{
    ....
 
    SvgHelper.Init();
 
    ...
}

AppDelegate

public override bool FinishedLaunching(UIApplication app, NSDictionary options)
{
    ...
 
    SvgHelper.Init();
 
    ...
}

Use TinySvgHelper

You can use TinySvgHelper from either XAML or C# code. For iOS, add the svg files to the **Resources** folder; for Android, add them to the **Assets** folder.
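If you add the files manually, double-check that they get the expected build actions in the respective project file. This is standard Xamarin project behavior rather than anything specific to TinySvgHelper, and logo.svg is just a placeholder file name:

<!-- iOS project (.csproj): files under Resources are bundled with the app -->
<BundleResource Include="Resources\logo.svg" />

<!-- Android project (.csproj): files under Assets are shipped as Android assets -->
<AndroidAsset Include="Assets\logo.svg" />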

XAML

First you need to import the namespace:

xmlns:svg="clr-namespace:TinySvgHelper;assembly=TinySvgHelper"

Then you can use it as a markup extension:

<Image Source="{svg:Svg FileName='logo.svg', Height=200, Width=200}" />

You can also specify a color:

<ShellContent Icon="{svg:Svg FileName='info.svg', Height=22, Width=22, Color=Black}" Title="Start">
    <views:StartView />
</ShellContent>

C#

using TinySvgHelper;
var image = new Image();
image.Source = SvgHelper.GetAsImageSource("info.svg", 50, 50, Color.Blue);
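
As a minimal sketch of how this can be wired into an app, here is a page that uses the helper for its content. The LogoPage class and logo.svg are just illustrations for this example, not part of the library:

using TinySvgHelper;
using Xamarin.Forms;

public class LogoPage : ContentPage
{
    public LogoPage()
    {
        //GetAsImageSource renders the svg at the requested size and tint color.
        Content = new Image
        {
            Source = SvgHelper.GetAsImageSource("logo.svg", 200, 200, Color.Black)
        };
    }
}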

Read more

You can find the full documentation, the code and a sample project on GitHub, https://github.com/TinyStuff/TinySvgHelper

App In The Cloud: Native mobile, desktop and WebAssembly apps with C# and XAML using the Uno Platform with Jérôme Laban

With Uno, you can build native apps and WebAssembly apps with C# and XAML using WinRT (UWP) APIs. In this episode, Daniel is talking with Jérôme Laban, the CTO of Uno Platform, about what Uno is, the history of Uno, the current state of Uno, and much more.

Jérôme Laban on LinkedIn: https://www.linkedin.com/in/jeromelaban/
Jérôme Laban on Twitter: https://twitter.com/jlaban

Uno website: https://platform.uno/
Uno on GitHub: https://github.com/unoplatform/uno
Uno on Twitter: https://twitter.com/UnoPlatform

Daniel Hindrikes on Twitter: https://twitter.com/hindrikes
App In The Cloud on Facebook: https://www.facebook.com/appinthecloud
App In the Cloud on Twitter: https://twitter.com/AppInTheCloudPod

Intro music with a loop from https://www.looperman.com/

App In The Cloud – Live Code – Episode 1

Together with my friends and colleagues Johan Karlsson and Mats Törnberg, I have started to live code on Twitch, https://www.twitch.tv/danielhindrikes.
We have started to build a Xamarin.Forms app from scratch, and the plan is to continue building that app live on Twitch every Monday at 12:00 CET (Swedish time). If you miss a stream, all episodes will also be published on YouTube.

Here is the recording from the stream:

The Code Behind: AR for dummies

In the book Xamarin.Forms Projects, written by Johan Karlsson and me, there is a chapter about how to build an AR app using ARKit, ARCore, and UrhoSharp.

In The Code Behind, the podcast (in Swedish) that we are recording, we talk about that topic. So if you don't speak Swedish, this is a great chance to learn 😉

Machine Learning with CoreML and Xamarin.iOS

In iOS 11, Apple introduced CoreML. CoreML makes it possible to run predictions on trained models locally on iOS devices. For image predictions, I recommend reading my earlier blog post, https://danielhindrikes.se/index.php/2018/07/05/using-machine-learning-for-image-classification-in-your-apps/. This blog post will focus on how to make predictions with text input.

If you don't have a model that you have trained yourself, you can search for one on the internet. There are a couple of great collections of CoreML models out there.

The model that we will use in this post is from coreml.store and is a model for sentiment analysis of text.

The first thing we will do is import the model into Visual Studio (for Mac). The model should be placed in the Resources folder of the iOS project. When we import a model, Visual Studio generates code that we can use to work with the model.

If we want, we can delete the generated code and write it ourselves, but I prefer to use the generated code. With this model, however, we need to edit it a little bit because there is a naming conflict (note that there is always a risk that the code is regenerated and our changes get overwritten, so we will keep the changes in this file as small as possible). The conflict is in the GetPrediction method at the end of the file. I'm changing the name of the declared SentimentPolarityInput variable from input to sentimentPolarityInput.

public SentimentPolarityOutput GetPrediction (NSDictionary<NSObject, NSNumber> input, out NSError error)
{
     var sentimentPolarityInput = new SentimentPolarityInput (input);
 
     return GetPrediction (sentimentPolarityInput, out error);
}

The next step is to compile the model.

var assetPath = NSBundle.MainBundle.GetUrlForResource("SentimentPolarity", "mlmodel");
 
var compiledUrl = MLModel.CompileModel(assetPath, out var error);

With the URL for the compiled model, we can create an instance of the class that was generated when we imported the model into Visual Studio.

var ml = SentimentPolarity.Create(compiledUrl, out var createError);
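
Both MLModel.CompileModel and Create report failures through their out NSError parameters, so it is a good idea to check them before we continue. A minimal sketch:

if (error != null || createError != null)
{
    Console.WriteLine($"Could not load the model: {(error ?? createError).LocalizedDescription}");
    return;
}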

Before we can do any prediction, we need to prepare the data a little bit, because the GetPrediction method needs an NSDictionary. In this case, that means a dictionary of all the words in the text and how many times they occur. If you download a model that someone else has created, I recommend looking for sample code that shows how to use it. Sample projects are often written in Swift or Objective-C, but it is usually not a problem to figure out what they are doing with the model.

What we will do here is split the text by whitespace. We will also ignore all words that are two characters or shorter, because those words will probably not affect the result.

 
var dictionary = new Dictionary<NSObject, NSNumber>();
 
//text contains the string that we want to analyze
var words = text.Split(' ');
 
foreach (var word in words)
{
     if (word.Length > 2)
     {
          var token = NSObject.FromObject(word);
 
          if (dictionary.ContainsKey(token))
          {
               dictionary[token] = NSNumber.FromDouble(dictionary[token].DoubleValue + 1.0);
          }
          else
          {
               dictionary.Add(token, 1.0);
          }
     }
}
 
var data = new NSDictionary<NSObject, NSNumber>(dictionary.Keys.ToArray(), dictionary.Values.ToArray());

Now that we have prepared the data, we are ready to do predictions.

var result = ml.GetPrediction(new SentimentPolarityInput(data), out var predictionError);
var positiveScore = ((NSNumber)result.ClassProbability["Pos"]).DoubleValue;
var negativeScore = ((NSNumber)result.ClassProbability["Neg"]).DoubleValue;
 
if (positiveScore > negativeScore)
{
    Console.WriteLine("The text is positive");
}
else
{
    Console.WriteLine("The text is negative");
}
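
To recap, the whole flow can be collected in one method. This is just a sketch based on the snippets above, assuming the SentimentPolarity and SentimentPolarityInput classes that were generated from the imported model:

using System;
using System.Collections.Generic;
using System.Linq;
using Foundation;

public static class SentimentAnalyzer
{
    public static bool IsPositive(SentimentPolarity model, string text)
    {
        //Count the occurrences of every word longer than two characters.
        var dictionary = new Dictionary<NSObject, NSNumber>();

        foreach (var word in text.Split(' ').Where(x => x.Length > 2))
        {
            var token = NSObject.FromObject(word);

            if (dictionary.ContainsKey(token))
                dictionary[token] = NSNumber.FromDouble(dictionary[token].DoubleValue + 1.0);
            else
                dictionary.Add(token, NSNumber.FromDouble(1.0));
        }

        var data = new NSDictionary<NSObject, NSNumber>(dictionary.Keys.ToArray(), dictionary.Values.ToArray());

        //Run the prediction and compare the positive and negative scores.
        var result = model.GetPrediction(new SentimentPolarityInput(data), out var error);

        var positiveScore = ((NSNumber)result.ClassProbability["Pos"]).DoubleValue;
        var negativeScore = ((NSNumber)result.ClassProbability["Neg"]).DoubleValue;

        return positiveScore > negativeScore;
    }
}

With the ml instance we created earlier, SentimentAnalyzer.IsPositive(ml, "I love this app") returns true if the model scores the text as more positive than negative.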

Now we know the basics of how to use CoreML in a Xamarin.iOS app, and we can start building smarter apps based on machine learning!

Siri Shortcuts with Xamarin

With iOS 12, Apple introduced Siri Shortcuts. As developers, we can create shortcuts using SiriKit. In this blog post, we will walk through how to do that in an app built with Xamarin.iOS.

The first step is to update our provisioning profile.

    1. Sign in to https://developer.apple.com.
    2. Go to the account page.
    3. Go to Certificates, Identifiers & Profiles.
    4. Go to App IDs.
    5. Select the App ID that you want to use and press the edit button.
    6. Enable the SiriKit service for the selected App ID.
    7. Go to provisioning profiles and select the profile you want to use for the app that you will implement SiriKit in.
    8. Select to edit the provisioning profile.
    9. Regenerate the provisioning profile.
    10. Download the provisioning profile and install it on the machine that you are using for building the iOS app.

Now that we have the updated provisioning profile, we can open the app solution in Visual Studio and start implementing SiriKit. The first thing we will do is add Siri to our Entitlements.plist file.

<plist version="1.0">
    <dict>
	<key>com.apple.developer.siri</key>
	<true/>
    </dict>
</plist>

We need to verify that the Entitlements.plist file is used as Custom Entitlements. The file has to be set for each configuration that we want to use shortcuts in. Custom Entitlements is configured under the Bundle Signing page of the project settings.
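
In the iOS csproj, this maps to the CodesignEntitlements property for each configuration. The snippet below is just an illustration; your configuration and platform names may differ:

<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|iPhone' ">
    <CodesignEntitlements>Entitlements.plist</CodesignEntitlements>
</PropertyGroup>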

We need to add all the shortcuts that we will create to Info.plist, under the NSUserActivityTypes key. The naming recommendation is {bundleID}.{activityName}. In the example below, there is one shortcut for sending a message and one for showing unread messages.

<key>NSUserActivityTypes</key>
<array>
    <string>se.danielhindrikes.DemoApp.sendMessage</string>
    <string>se.danielhindrikes.DemoApp.showUnreadMessages</string>
</array>

Shortcuts are created in the ViewController that the shortcut should take the user to. The idea is that shortcuts should be created for the pages in the app that the user uses often.

If the app has a minimum iOS version lower than 12, we need to verify that the user is running the app on iOS 12 or higher.

if (UIDevice.CurrentDevice.CheckSystemVersion(12, 0))
{
    //Add code here
}

The next step is to create an NSUserActivity (the shortcut). Here we are creating a separate method for that.

public const string ViewMenuActivityType = "se.danielhindrikes.DemoApp.sendMessage";
 
private NSUserActivity GetSendMessageUserActivity()
{
      //Create a user activity and make it eligible for Siri to find for search and predictions.
      var userActivity = new NSUserActivity(ViewMenuActivityType)
      {
          Title = "Send message",
          EligibleForSearch = true,
          EligibleForPrediction = true
      };
 
      //Add some attributes to the user activity
      var attributes = new CSSearchableItemAttributeSet("SendMessage")
      {
          ThumbnailData = UIImage.FromBundle("Icon.png").AsPNG(),
          //Keywords is a string array in Xamarin.iOS
          Keywords = new[] { "send", "message", "daniel", "hindrikes" },
          DisplayName = "Send message",
          ContentDescription = "Send message in the Daniel Hindrikes Demo App"
      };
 
      userActivity.ContentAttributeSet = attributes;
 
      //Add a suggestion for a phrase the user can use to activate the shortcut via Siri.
      var phrase = "Send message in the Daniel Hindrikes Demo App";
      userActivity.SuggestedInvocationPhrase = phrase;
      return userActivity;
}

When we have created the user activity, we can use it to set the UserActivity property of the ViewController. We will do this in the ViewDidLoad override.

public override void ViewDidLoad()
{
     base.ViewDidLoad();
 
     if (UIDevice.CurrentDevice.CheckSystemVersion(12, 0))
     {
           UserActivity = GetSendMessageUserActivity();
           UserActivity.BecomeCurrent();
     }
}

If we are developing a Xamarin.Forms app, we can access the ViewController via a Custom Renderer for the page.

[assembly: ExportRenderer(typeof(SendMessageView), typeof(CustomSendMessageViewViewRenderer))]
namespace Se.Co.App.iOS.Renderers
{
    public class CustomSendMessageViewViewRenderer : PageRenderer
    {
 
        public override void ViewDidLoad()
        {
            base.ViewDidLoad();
 
            if (UIDevice.CurrentDevice.CheckSystemVersion(12, 0))
            {
                 //Create the user activity the same way as in the ViewController example above.
                 ViewController.UserActivity = GetSendMessageUserActivity();
                 ViewController.UserActivity.BecomeCurrent();
            }
        }
    }
}

The last thing we need to do is add code to AppDelegate.cs that handles what should happen when a shortcut is activated.

public override bool ContinueUserActivity(UIApplication application, NSUserActivity userActivity, UIApplicationRestorationHandler completionHandler)
{
    if (userActivity.ActivityType == SendMessageViewController.ViewMenuActivityType)
    {
        //Navigate to the send message view.
 
        return true;
    }
    else if (userActivity.ActivityType == ShowUnreadMessagesViewController.ViewMenuActivityType)
    {
        //Navigate to the show unread messages view.
 
        return true;
    }
 
    return base.ContinueUserActivity(application, userActivity, completionHandler);
}
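
How the navigation itself is implemented depends on the app. In a Xamarin.Forms app, one option is to forward the event to the shared code, for example with MessagingCenter. The message name below is made up for this sketch; the shared code would subscribe to it and perform the actual navigation:

//Hypothetical message name; subscribe to it in the shared code and navigate there.
MessagingCenter.Send(Xamarin.Forms.Application.Current, "NavigateToSendMessage");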

To make it easier to test the shortcuts, go to Settings > Developer and enable Display Recent Shortcuts and Display Donations on Lock Screen.

Once you have deployed the app and visited the views that add user activities, you will see shortcuts when you swipe down on the home screen. You can also see all shortcuts if you open Siri under Settings.

Read more about how to use SiriKit in Xamarin.iOS here, https://docs.microsoft.com/en-us/xamarin/ios/platform/sirikit/