Integrating AppGallery Connect Crash in a Xamarin App for Android

Today, we are going to take a look at how to integrate the AppGallery Connect Crash service into your Xamarin app.

But why might you want to do this? The AppGallery Connect Crash service provides a powerful yet lightweight solution to app crash problems. With the service, you can quickly detect, locate, and resolve app crashes (unexpected exits of apps), and you get access to highly readable crash reports in real time, without having to write any code.

Integrating App Linking in a Xamarin App for Android

Xamarin is a popular cross-platform framework for building mobile applications using .NET.

A number of AppGallery Connect services support cross-platform frameworks, including Xamarin. Today we are going to take a look at how you can use one of these services, App Linking, within your Xamarin project.

What Can I Do if a Stack Overflow Occurs in a Quick App?

When the node obtained through $element('id') is assigned to a member variable, a stack overflow (RangeError: Maximum call stack size exceeded) may occur and the program will crash. In other words, if a member variable holds a reference to a DOM node and a member variable then changes, a stack overflow will occur. The sample code is as follows:

JavaScript
 
<template>
  <div id="content">
    <input type="button" class="button" @click="onTestClick" value="Stack overflow occurs."/>
    <text>{{ stateText }}</text>
  </div>
</template>
<script>
  export default {
    private: {
      mContentNode: null,
      stateText: 'init state'
    },
    onReady() {
      /* When data obtained by $element('id') is assigned to a member variable, a stack overflow may occur. */
      this.mContentNode = this.$element('content')
    },
    onTestClick() {
      /* To reproduce this problem, change a member variable when member variable references exist for a DOM. */
      this.stateText = 'new state'
    }
  }
</script>
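
Based on the cause above, one possible workaround (a minimal sketch, assuming the node is only needed inside the event handler) is to stop caching the $element('id') result in a member variable and look the node up on demand instead, so that changing a member variable no longer traverses a stored DOM reference:

JavaScript

<template>
  <div id="content">
    <input type="button" class="button" @click="onTestClick" value="No stack overflow."/>
    <text>{{ stateText }}</text>
  </div>
</template>
<script>
  export default {
    private: {
      stateText: 'init state'
    },
    onTestClick() {
      /* Look the node up locally when it is needed instead of storing it
         in a member variable, so updating stateText does not traverse a
         cached DOM reference. */
      const contentNode = this.$element('content')
      console.info('node found: ' + (contentNode !== null))
      this.stateText = 'new state'
    }
  }
</script>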


How Can I Quickly Integrate AppGallery Connect APM Into a Unity App?

When an app is used, problems such as slow app launch, Application Not Responding (ANR) errors, app crashes, and network loading failures may occur. These are the major issues that affect user experience.

To meet the increasing demand for diagnosing performance problems, more and more app performance monitoring services have emerged on the market. HUAWEI AppGallery Connect provides full-process quality services covering app development, testing, release, and analysis. If you want to quickly try out the APM service, see the demo on GitHub.

How to Integrate Remote Configuration of AppGallery Connect in React Native

If you want to experience the remote configuration function, see the GitHub demo.

Integration Procedure

  1. Install the React Native dependency.
    1. npm install -g yarn
  2. Create a project and enable Remote Configuration.
    • a) Create an Android app in AppGallery Connect, add it to a project, enable Remote Configuration, and add a parameter.
    • b) Run the following command to create a React Native project. In this example, we named the project RemoteConfig.
      • npx react-native init RemoteConfig 
    • c) Add the configuration file to your React Native project by placing the agconnect-services.json file in the android/app directory.
    • d) Configure the Maven repository address and AppGallery Connect plug-in address.
      1. Open the build.gradle file in the android directory of your React Native project.
        • Go to allprojects > repositories and configure the Maven repository address.
        • Go to buildscript > repositories and configure the Maven repository address.
        • Go to buildscript > dependencies and configure the AppGallery Connect plug-in address.
      2. Add build dependencies and the AppGallery Connect plug-in address. Open the build.gradle file in the android/app directory of the React Native project and add the plug-in address.
  3. Install the plug-in.
    • Add the plug-in to the dependencies section of the package.json file in your project.
    • Run npm install or yarn install to install the plug-in.
      • npm install
  4. Use the Remote Configuration service (a code sketch follows this list).
    • a) Apply the local settings. Define the local settings as a key-value map and call the applyDefault API to apply them.
    • b) Fetch the cloud data or the parameter values fetched last time. Call the fetch API to fetch parameter values from the cloud at a given interval. Then call applyLastFetch to apply the data that was last fetched from the cloud.
    • c) Merge the local and cloud data. Call getMergedAll to merge the local and cloud data.
    • d) Clear the data. Call the clearAll API to clear the cached data that was fetched earlier.
    • e) Fetch the value of a key from the cloud. Call getValue to fetch the value of a specific key from the cloud.
    • f) Package the APK file. Run the yarn android command in the root directory of your project.
    • g) Check the result. You can obtain all required parameter values, including the on-cloud parameter values.
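
The following is a minimal sketch that strings steps a) through e) together in one function. The import path (@react-native-agconnect/remoteconfig), the loadRemoteConfig function name, and the sample key greeting are assumptions made for illustration; the method names come from the steps above, but verify the exact signatures against the plug-in version you installed.

JavaScript

/* Minimal sketch of steps a) to e). The import path and exact method
   signatures are assumptions; check them against the installed plug-in. */
import AGCRemoteConfig from '@react-native-agconnect/remoteconfig';

export async function loadRemoteConfig() {
  /* a) Apply the local settings as a key-value map. */
  await AGCRemoteConfig.applyDefault({ greeting: 'hello', showBanner: 'false' });

  try {
    /* b) Fetch parameter values from the cloud (interval in seconds),
       then apply the values that were fetched last time. */
    await AGCRemoteConfig.fetch(0);
    await AGCRemoteConfig.applyLastFetch();
  } catch (error) {
    /* If fetching fails, the local defaults applied above remain in effect. */
    console.warn('Remote Configuration fetch failed', error);
  }

  /* c) Merge the local and cloud data. */
  const merged = await AGCRemoteConfig.getMergedAll();
  console.log('Merged configuration:', merged);

  /* e) Fetch the value of a single key ('greeting' is a sample key added
     in AppGallery Connect for this illustration). */
  const greeting = await AGCRemoteConfig.getValue('greeting');
  console.log('greeting =', greeting);

  /* d) Optionally clear the cached data fetched earlier. */
  /* await AGCRemoteConfig.clearAll(); */
}

You would call loadRemoteConfig() early in your app, for example from your root component, before packaging the APK with yarn android.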

How to Quickly Integrate Cloud Functions of AppGallery Connect Into a Cocos-Based App

1. Environment

2. Enabling and Configuring Cloud Functions in AppGallery Connect

Note: Currently, Cloud Functions is still in beta. To use the service, you need to apply by sending an email. For details, check here.

1. Create an app first and add it to a project, or select an app from the project list on the My projects page in AppGallery Connect. 

How to Integrate HUAWEI ML Kit’s Image Super-Resolution Capability

Have you ever been sent compressed images with poor definition? Even when you zoom in, the image is still blurry. I recently received a ZIP file of travel photos from a trip I went on with a friend. After opening it, I found to my dismay that each image was either too dark, too dim, or too blurry. How was I going to show off with such terrible photos? So, I sought help from the Internet, and luckily, I came across HUAWEI ML Kit's image super-resolution capability. The amazing thing is that this SDK is free of charge and can be used with all Android phones.

Background

ML Kit's image super-resolution capability is backed by a deep neural network and provides two super-resolution capabilities for mobile apps: