Commit 610d4058 authored by root

华理 digital human Android code

*.iml
.gradle
/local.properties
/.idea/
.DS_Store
/build/
*/build/
*/*/build/
*/debug/
*/release/
/captures
.externalNativeBuild
.cxx
*.bat
*.apk
output-metadata.json
# Ignore this if your app uses the zip files
# Remove this entry if you customize local.properties
local.properties
~$*
gradlew
.idea
# Remove this entry if you customize the root gradle/gradle-wrapper.properties (customizing it is not recommended)
#gradle
build
# dh_aigc_android_success
# Silicon-Based Digital Human SDK
## I. Product Introduction
A 2D virtual digital human SDK that drives the avatar in real time from speech audio.
### 1. Suitable Scenarios
- Low deployment cost: no technical team is required on the customer side; supports fast, low-cost deployment on a wide range of terminals and large screens.
- Low network dependency: can be deployed as a self-service virtual assistant in subways, banks, government halls, and similar scenarios.
- Diverse functionality: can be tailored to customer needs across video, media, customer service, finance, broadcasting, and other industries.
### 2. Core Functions
Provides customized AI anchors, intelligent customer service, and other multi-scenario avatar rentals, supporting fast deployment and low-cost operation.
- Exclusive avatar customization: build a dedicated virtual-assistant avatar, with either low-cost or in-depth avatar generation.
- Broadcast content customization: define dedicated broadcast content for training, announcements, and other scenarios.
- Real-time interactive Q&A: supports live dialogue and custom Q&A libraries, covering information queries, voice chat, virtual companionship, and vertical-domain customer service.
## II. SDK Integration
### 1. Supported Systems and Hardware Versions
| Item | Description |
| :-------------------- | :----------------------------------------------------------- |
| System                | Supports Android 7.0 (API Level 24) through Android 13 (API Level 33). |
| CPU Architecture      | armeabi-v7a, arm64-v8a |
| Hardware Requirements | 4-core CPU or better, 4 GB of RAM or more, and at least 500 MB of free storage. |
| Network               | Supports Wi-Fi and mobile networks. When the cloud Q&A library is used, the bandwidth available to the digital human should be 10 Mbps or higher. |
| Development IDE       | Android Studio Giraffe \| 2022.3.1 Patch 2 |
| Memory Requirements   | Memory available to the digital human >= 400 MB |
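The CPU, memory, and ABI requirements above can be checked at runtime before initializing the SDK. A minimal sketch, not part of the SDK API; the thresholds simply mirror the table:
```kotlin
import android.app.ActivityManager
import android.content.Context
import android.os.Build

// Rough pre-flight check against the requirements listed above.
fun deviceLooksCapable(context: Context): Boolean {
    val abiOk = Build.SUPPORTED_ABIS.any { it == "arm64-v8a" || it == "armeabi-v7a" }
    val am = context.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
    val mem = ActivityManager.MemoryInfo().also { am.getMemoryInfo(it) }
    val memOk = mem.availMem >= 400L * 1024 * 1024           // >= 400 MB available
    val coresOk = Runtime.getRuntime().availableProcessors() >= 4
    return abiOk && memOk && coresOk
}
```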
### 2. SDK Integration
Add the following configuration to build.gradle:
```gradle
dependencies {
// reference SDK project
implementation project(":duix-sdk")
// The SDK uses exoplayer to handle audio (required)
implementation 'com.google.android.exoplayer:exoplayer:2.14.2'
// Cloud Q&A interface uses SSE component (optional)
implementation 'com.squareup.okhttp3:okhttp-sse:4.10.0'
...
}
```
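`implementation project(":duix-sdk")` assumes the SDK module is registered in the project's settings file (typically a one-line `include ':duix-sdk'` in Groovy). For projects on the Kotlin DSL, a minimal settings.gradle.kts sketch (the root project name is a placeholder):
```kotlin
// settings.gradle.kts — register the app module and the local SDK module
rootProject.name = "duix-demo"   // placeholder
include(":app", ":duix-sdk")
```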
Permission requirements: add the following to AndroidManifest.xml:
```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CHANGE_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.WAKE_LOCK" />
</manifest>
```
## III. SDK Invocation and API Description
### 1. Initialize SDK
Build the DUIX object and register the callback in the render page's onCreate() stage:
```kotlin
duix = DUIX(mContext, baseDir, modelDir, mDUIXRender) { event, msg, info ->
    when (event) {
        ai.guiji.duix.sdk.client.Constant.CALLBACK_EVENT_INIT_READY -> {
            initOK()
        }
        ai.guiji.duix.sdk.client.Constant.CALLBACK_EVENT_INIT_ERROR -> {
        }
        // ...
    }
}
// Asynchronous callback result
duix?.init()
```
DUIX constructor parameters:
| Parameter | Type | Description |
| :-------- | :--------- | :----------------------------------------------------------- |
| context | Context | System context |
| baseDir   | String     | Folder holding the base configuration files for model driving; managed by the caller. Unzip the archive to external storage and pass the folder path. |
| modelDir  | String     | Folder holding the model files; managed by the caller. Unzip the archive to external storage and pass the folder path. |
| render    | RenderSink | Rendering data sink; the SDK provides a default renderer that implements this interface, or you can supply your own implementation. |
| callback | Callback | Various callback events handled by the SDK |
See the LiveActivity example in the demo for a complete reference.
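Both baseDir and modelDir must already exist on disk when init() is called. A minimal sketch of building the two paths after unpacking the base-configuration and model archives to app-specific external storage; the folder names below are placeholders, use whatever your archives actually contain:
```kotlin
import java.io.File

// Assumed layout after unzipping the two archives (names are placeholders):
//   <external-files-dir>/duix/gj_dh_res    -> base configuration (baseDir)
//   <external-files-dir>/duix/model_person -> avatar model       (modelDir)
val duixRoot = File(requireNotNull(getExternalFilesDir(null)), "duix")
val baseDir = File(duixRoot, "gj_dh_res").absolutePath
val modelDir = File(duixRoot, "model_person").absolutePath

duix = DUIX(mContext, baseDir, modelDir, mDUIXRender) { event, msg, info ->
    // handle init.ready / init.error as shown above
}
duix?.init()
```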
### 2. Get SDK model initialization status
```kotlin
object : Callback {
    override fun onEvent(event: String, msg: String, info: Any?) {
        when (event) {
            "init.ready" -> {
                // SDK model initialization succeeded
            }
            "init.error" -> {
                // Initialization failed
                Log.e(TAG, "init error: $msg")
            }
            // ...
        }
    }
}
```
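If the SDK delivers these events off the main thread (treat that as an assumption and verify it in your integration), anything that touches views should be posted back to it. A minimal sketch from inside an Activity:
```kotlin
"init.ready" -> runOnUiThread {
    // Safe to update status views and enable playback controls here
    initOK()
}
```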
### 3. Digital Human Avatar Display
Use the RenderSink interface to receive rendered frame data; the SDK provides an implementation, DUIXRenderer.java, and you can also implement the interface yourself for custom rendering. RenderSink is defined as follows:
```java
/**
* Rendering pipeline, returns rendering data through this interface
*/
public interface RenderSink {
    // The buffer in the frame is laid out in BGR order
    void onVideoFrame(ImageFrame imageFrame);
}
```
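Because RenderSink is a single-method interface, a custom implementation can simply wrap the default renderer, for example to count or inspect frames before forwarding them. A minimal pass-through sketch (our own helper, not an SDK class); it does not touch the BGR buffer:
```kotlin
class CountingSink(private val delegate: RenderSink) : RenderSink {
    var frameCount = 0L
        private set

    override fun onVideoFrame(imageFrame: ImageFrame) {
        frameCount++                        // observe the stream
        delegate.onVideoFrame(imageFrame)   // forward the BGR frame to the real renderer
    }
}
```
Such a wrapper can then be passed to the DUIX constructor in place of mDUIXRender, e.g. CountingSink(mDUIXRender).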
A simple display can be built with DUIXRenderer and the DUIXTextureView control; the control supports an alpha channel, so the background and foreground can be styled freely:
```kotlin
override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    // ...
    mDUIXRender = DUIXRenderer(mContext, binding.glTextureView)
    binding.glTextureView.setEGLContextClientVersion(GL_CONTEXT_VERSION)
    binding.glTextureView.setEGLConfigChooser(8, 8, 8, 8, 16, 0)   // transparency
    binding.glTextureView.isOpaque = false                         // transparency
    binding.glTextureView.setRenderer(mDUIXRender)
    binding.glTextureView.renderMode =
        GLSurfaceView.RENDERMODE_WHEN_DIRTY   // must be called after setting the renderer
    duix = DUIX(mContext, duixOptions, mDUIXRender) { event, msg, _ ->
    }
    // ...
}
```
### 4. Start Digital Human Broadcast
After initialization succeeds, play an audio file to drive the avatar:
```kotlin
duix?.playAudio(wavPath)
```
Parameter description:
| Parameter | Type | Description |
| :-------- | :----- | :----------------------------------------------------------- |
| wavPath   | String | Local path or HTTPS URL of a 16 kHz, mono WAV file |
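When you generate the WAV files yourself, it can be worth asserting the 16 kHz / mono constraint before calling playAudio(). A small header check for canonical 44-byte RIFF/WAVE headers (our own helper, not an SDK API; files with extra chunks before the fmt chunk would need a full parser):
```kotlin
import java.io.File

// In a canonical RIFF/WAVE header: bytes 22-23 = channel count,
// bytes 24-27 = sample rate, both little-endian.
fun isMono16kWav(path: String): Boolean {
    val header = ByteArray(44)
    File(path).inputStream().use { if (it.read(header) < 44) return false }
    fun u8(i: Int) = header[i].toInt() and 0xFF
    val channels = u8(22) or (u8(23) shl 8)
    val sampleRate = u8(24) or (u8(25) shl 8) or (u8(26) shl 16) or (u8(27) shl 24)
    return channels == 1 && sampleRate == 16000
}
```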
Audio playback status and progress callback:
```kotlin
object : Callback {
    override fun onEvent(event: String, msg: String, info: Any?) {
        when (event) {
            // ...
            "play.start" -> {
                // Audio playback started
            }
            "play.end" -> {
                // Audio playback finished
            }
            "play.error" -> {
                // Audio playback failed
            }
            "play.progress" -> {
                // Audio playback progress
            }
        }
    }
}
```
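playAudio() combined with the play.end event is enough for simple sequential playback. A minimal sketch that plays a queue of local files one after another; the queue is our own structure, not part of the SDK:
```kotlin
private val playlist = ArrayDeque<String>()

private fun playNext() {
    playlist.removeFirstOrNull()?.let { duix?.playAudio(it) }
}

// In the Callback, advance on completion (or on failure, to avoid stalling):
// "play.end", "play.error" -> playNext()
```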
### 5. Terminate Current Broadcast
Call this method while the digital human is broadcasting to stop the current playback.
Function definition:
```java
boolean stopAudio();
```
Example call:
```kotlin
duix?.stopAudio()
```
### 6. Play Action Interval
If the model contains motion (action) segments, call this function to play one; when several segments are available, one is chosen at random.
Function definition:
```java
void motion();
```
Example call:
```kotlin
duix?.motion()
```
### 7. Proguard Configuration
If your build uses code shrinking/obfuscation, add the following to proguard-rules.pro:
```pro
-keep class com.btows.ncnntest.** { *; }
-dontwarn com.squareup.okhttp3.**
-keep class com.squareup.okhttp3.** { *; }
```
## IV. Precautions
1. Driving and rendering require the base configuration folder and the corresponding model folder paths to be set correctly.
2. Keep the audio files reasonably small; importing a very large audio file consumes a lot of CPU and makes rendering stutter.
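For the second point, a coarse size guard before calling playAudio() helps avoid feeding an overly long recording to the renderer. The 10 MB cap below is only an illustrative threshold, not an SDK limit:
```kotlin
import android.util.Log
import java.io.File

private const val MAX_WAV_BYTES = 10L * 1024 * 1024   // illustrative cap, tune for your devices

fun playIfReasonable(path: String) {
    val file = File(path)
    if (file.isFile && file.length() <= MAX_WAV_BYTES) {
        duix?.playAudio(path)
    } else {
        Log.w(TAG, "Skipping ${file.name}: missing or larger than $MAX_WAV_BYTES bytes")
    }
}
```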
## V. Version Record
**3.0.4**
```text
1. Fixed an issue where the default low-precision GL float on some devices prevented the avatar from displaying correctly.
```
**3.0.3**
```text
1. Optimized local rendering.
```
## VI. Other Related Third-Party Open Source Projects
| Module | Description |
| :----------------------------------------------- | :-------------------------------------------------- |
| [ExoPlayer](https://github.com/google/ExoPlayer) | Media player |
| [okhttp](https://github.com/square/okhttp) | Networking framework |
| [onnx](https://github.com/onnx/onnx)             | Open Neural Network Exchange model format            |
| [ncnn](https://github.com/Tencent/ncnn) | High-performance neural network computing framework |
<?xml version="1.0" encoding="utf-8" ?>
<!-- https://github.com/bumptech/glide/issues/4940 -->
<lint>
<issue id="NotificationPermission">
<ignore regexp="com.bumptech.glide.request.target.NotificationTarget" />
</issue>
</lint>
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
repositories {
maven { url 'https://maven.aliyun.com/repository/public/' }
maven { url 'https://maven.aliyun.com/repository/central' }
maven { url 'https://maven.aliyun.com/repository/google' }
maven { url 'https://maven.aliyun.com/repository/gradle-plugin' }
maven { url 'https://jitpack.io' }
maven { url 'https://repo1.maven.org/maven2/' }
google()
}
dependencies {
classpath 'com.android.tools.build:gradle:8.1.2'
classpath 'org.jetbrains.kotlin:kotlin-gradle-plugin:1.8.10'
}
}
allprojects {
repositories {
maven { url 'https://maven.aliyun.com/repository/public/' }
maven { url 'https://maven.aliyun.com/repository/central' }
maven { url 'https://maven.aliyun.com/repository/google' }
maven { url 'https://maven.aliyun.com/repository/gradle-plugin' }
maven { url 'https://jitpack.io' }
maven { url 'https://repo1.maven.org/maven2/' }
google()
}
}
ext {
compileSdkVersion = 33
buildToolsVersion = '30.0.2'
minSdkVersion = 24
targetSdkVersion = 33
versionCode = 2
versionName = "0.0.2"
}
/build
plugins {
id 'com.android.library'
}
android {
namespace 'ai.guiji.duix.sdk.client'
compileSdk 33
defaultConfig {
minSdk 24
versionCode 4
versionName '3.0.3'
externalNativeBuild {
cmake {
abiFilters 'arm64-v8a', "armeabi-v7a"
cppFlags "-std=c++17", "-fexceptions"
//arguments "-DANDROID_STL=c++_shared","-DANDROID_TOOLCHAIN=clang"
}
}
}
buildTypes {
debug {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
buildConfigField("String", "VERSION_NAME", "\"${defaultConfig.versionName}\"")
buildConfigField('int', 'VERSION_CODE', "${defaultConfig.versionCode}")
}
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
buildConfigField("String", "VERSION_NAME", "\"${defaultConfig.versionName}\"")
buildConfigField('int', 'VERSION_CODE', "${defaultConfig.versionCode}")
android.libraryVariants.all { variant ->
variant.outputs.all {
outputFileName = "duix_client_sdk_${buildType.name}_${defaultConfig.versionName}.aar"
}
}
}
}
externalNativeBuild {
cmake {
path "src/main/cpp/CMakeLists.txt"
version "3.10.2"
}
}
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
}
// kotlinOptions {
// jvmTarget = '1.8'
// }
}
dependencies {
// Reference the SDK project
// The SDK uses ExoPlayer for audio processing (required)
implementation 'com.google.android.exoplayer:exoplayer:2.14.2'
implementation "org.java-websocket:Java-WebSocket:1.5.1"
implementation 'com.squareup.okhttp3:okhttp-sse:4.10.0'
}
# Add project specific ProGuard rules here.
# You can control the set of applied configuration files using the
# proguardFiles setting in build.gradle.
#
# For more details, see
# http://developer.android.com/guide/developing/tools/proguard.html
# If your project uses WebView with JS, uncomment the following
# and specify the fully qualified class name to the JavaScript interface
# class:
#-keepclassmembers class fqcn.of.javascript.interface.for.webview {
# public *;
#}
# Uncomment this to preserve the line number information for
# debugging stack traces.
#-keepattributes SourceFile,LineNumberTable
# If you keep the line number information, uncomment this to
# hide the original source file name.
#-renamesourcefileattribute SourceFile
-optimizationpasses 5 # code optimization level (0-7); 5 is typical and rarely needs changing
-dontusemixedcaseclassnames # do not use mixed-case class names
# Tell ProGuard not to skip non-public library classes (skipped by default)
-dontskipnonpubliclibraryclasses # also process classes inside bundled jar files
-verbose # log while obfuscating (produces the mapping file: original -> obfuscated class names)
# Optimization filter; this is the algorithm recommended by Google and is rarely changed
-optimizations !code/simplification/arithmetic,!field/*,!class/merging/*
# Keep generic signatures to avoid ClassCastException (prevents generics from being obfuscated)
-keepattributes Signature
# Keep annotations; also important for JSON entity mapping, e.g. fastjson
-keepattributes *Annotation*
# Keep source file and line number information for exception stack traces
-keepattributes SourceFile,LineNumberTable
# Keep native methods from being obfuscated
-keepclasseswithmembernames class * {
native <methods>;
}
# Keep onClick handler methods referenced from Android layout XML (these must not be obfuscated)
-keepclassmembers class * extends android.app.Activity {
public void *(android.view.View);
}
# Keep the listed constructors and setters of custom views
-keep public class * extends android.view.View {
public <init>(android.content.Context);
public <init>(android.content.Context, android.util.AttributeSet);
public <init>(android.content.Context, android.util.AttributeSet, int);
public void set*(...);
}
# Keep enums from being obfuscated
-keepclassmembers enum * {
public static **[] values();
public static ** valueOf(java.lang.String);
}
# Keep Parcelable implementations (AIDL classes must not be obfuscated)
-keep class * implements android.os.Parcelable {
public static final android.os.Parcelable$Creator *;
}
# Classes that are serialized/deserialized must not be obfuscated (nor classes used via Java reflection)
-keepnames class * implements java.io.Serializable
# Keep the listed members of classes that implement Serializable
-keepclassmembers class * implements java.io.Serializable {
static final long serialVersionUID;
private static final java.io.ObjectStreamField[] serialPersistentFields;
!static !transient <fields>;
private void writeObject(java.io.ObjectOutputStream);
private void readObject(java.io.ObjectInputStream);
java.lang.Object writeReplace();
java.lang.Object readResolve();
}
# Keep R classes; otherwise resource ids cannot be resolved via reflection
-keep class **.R$* { *; }
-keepclassmembers class * {
public <init> (org.json.JSONObject);
}
-keepclassmembers enum * {
public static **[] values();
public static ** valueOf(java.lang.String);
}
# Keep rules specific to this app
-keep class com.btows.ncnntest.**{*; }
-keep class ai.guiji.duix.sdk.client.render.** {*;}
-keep class ai.guiji.duix.sdk.client.render.**$* {*;}
-keep class ai.guiji.duix.sdk.client.bean.** {*;}
-keep class ai.guiji.duix.sdk.client.DUIX{*; }
-keep class ai.guiji.duix.sdk.client.DUIX$* {*;}
-keep class ai.guiji.duix.sdk.client.Constant{*; }
-keep class ai.guiji.duix.sdk.client.Constant* {*;}
-keep class ai.guiji.duix.sdk.client.DUIXOptions{*; }
-keep class ai.guiji.duix.sdk.client.DUIXOptions* {*;}
-keep class ai.guiji.duix.sdk.client.Callback{*; }
-keep class ai.guiji.duix.sdk.client.Callback* {*;}
-keep class ai.guiji.duix.sdk.client.render.DUIXTextureView{*; }
-keep class ai.guiji.duix.sdk.client.render.DUIXTextureView$* {*;}
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
</manifest>
#/****************************************************************************
#* Cartoonifier, for Android.
#*****************************************************************************
#* by Shervin Emami, 5th Dec 2012 (shervin.emami@gmail.com)
#* http://www.shervinemami.info/
#*****************************************************************************
#* Ch1 of the book "Mastering OpenCV with Practical Computer Vision Projects"
#* Copyright Packt Publishing 2012.
#* http://www.packtpub.com/cool-projects-with-opencv/book
#****************************************************************************/
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_SRC_FILES += \
base/coffeecatch.c \
android/DigitJni.cpp \
android/MsgcbJni.cpp \
android/JniHelper.cpp \
aisdk/jmat.cpp \
android/kmatarm.cpp \
aisdk/wavreader.cpp \
aisdk/wenet.cpp \
aisdk/aimodel.cpp \
aisdk/scrfd.cpp \
aisdk/pfpld.cpp \
aisdk/munet.cpp \
aisdk/blendgram.cpp \
aisdk/face_utils.cpp \
digit/netwav.cpp \
digit/looper.cpp \
digit/netcurl.cpp \
digit/GRender.cpp \
digit/GDigit.cpp \
digit/dispatchqueue.cpp \
render/EglRenderer.cpp \
render/RgbVideoRenderer.cpp \
render/SurfaceVideoRenderer.cpp \
render/RenderHelper.cpp \
render/AudioTrack.cpp \
render/AudioRenderer.cpp \
render/GlesProgram.cpp \
base/Log.cpp \
base/FrameSource.cpp \
base/MediaData.cpp \
base/MessageSource.cpp \
base/MessageHelper.cpp \
base/LoopThread.cpp \
base/XThread.cpp \
base/XTick.c \
base/cJSON.c \
base/dh_mem.c \
digit/grtcfg.c \
base/LoopThreadHelper.cpp
LOCAL_ARM_NEON := true
LOCAL_MODULE := facedetect
LOCAL_LDLIBS += -llog -ldl -lm -lmediandk
LOCAL_LDLIBS += -lEGL -lGLESv2 -landroid
LOCAL_LDLIBS += -ljnigraphics -fopenmp
LOCAL_CFLAGS += -fpermissive
LOCAL_CPPFLAGS += -fpermissive
#LOCAL_CFLAGS += -ftree-vectorizer-verbose=2
LOCAL_CPPFLAGS += -std=c++17
LOCAL_LDLIBS += -lstdc++
LOCAL_C_INCLUDES += $(LOCAL_PATH)
LOCAL_C_INCLUDES += include
LOCAL_C_INCLUDES += base
LOCAL_C_INCLUDES += aisdk
LOCAL_C_INCLUDES += digit
LOCAL_C_INCLUDES += render
LOCAL_C_INCLUDES += android
LOCAL_C_INCLUDES += third/arm/include
LOCAL_C_INCLUDES += third/arm/include/ncnn
LOCAL_C_INCLUDES += third/opencv-mobile-4.6.0-android/sdk/native/jni/include/
LOCAL_C_INCLUDES += third/ncnn-20221128-android-vulkan-shared/arm64-v8a/include/ncnn
include $(BUILD_SHARED_LIBRARY)
cmake_minimum_required(VERSION 3.10.2)
project(scrfdncnn)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++17 -funwind-tables")
set(OpenCV_DIR ${CMAKE_SOURCE_DIR}/third/opencv-mobile-4.6.0-android/sdk/native/jni)
find_package(OpenCV REQUIRED core imgproc highgui)
#set(ncnn_DIR ${CMAKE_SOURCE_DIR}/third/ncnn-20221128-android-vulkan-shared/${ANDROID_ABI}/lib/cmake/ncnn)
set(ncnn_DIR ${CMAKE_SOURCE_DIR}/third/ncnn-20231027-android-shared/${ANDROID_ABI}/lib/cmake/ncnn)
find_package(ncnn REQUIRED)
option(USE_OPENCV "shared library support" TRUE)
option(USE_NCNN "shared library support" TRUE)
include_directories(
include
base
render
aisdk
aes
digit
android
third/arm/include
third/arm/include/turbojpeg
)
add_library(scrfdncnn SHARED
android/DigitJni.cpp
android/MsgcbJni.cpp
android/JniHelper.cpp
aisdk/jmat.cpp
aisdk/wavreader.cpp
aisdk/wenet.cpp
aisdk/aimodel.cpp
aisdk/scrfd.cpp
aisdk/pfpld.cpp
aisdk/munet.cpp
aisdk/malpha.cpp
aisdk/wavcache.cpp
aisdk/blendgram.cpp
aisdk/face_utils.cpp
aisdk/netwav.cpp
digit/looper.cpp
digit/netcurl.cpp
digit/GRender.cpp
digit/GDigit.cpp
digit/dispatchqueue.cpp
render/EglRenderer.cpp
render/RgbVideoRenderer.cpp
render/SurfaceVideoRenderer.cpp
render/RenderHelper.cpp
base/BaseRenderHelper.cpp
base/AudioTrack.cpp
render/AudioRenderer.cpp
render/GlesProgram.cpp
base/Log.cpp
base/FrameSource.cpp
base/MediaData.cpp
base/MessageSource.cpp
base/MessageHelper.cpp
base/LoopThread.cpp
base/XThread.cpp
base/XTick.c
base/cJSON.c
base/dh_mem.c
digit/grtcfg.c
base/LoopThreadHelper.cpp
base/coffeecatch.c
aes/aes_cbc.c aes/aes_core.c aes/aes_ecb.c aes/base64.c aes/cbc128.c aes/gj_aes.c
aes/aesmain.c
)
add_library(turbojpeg STATIC IMPORTED)
set_target_properties(turbojpeg
PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/third/arm/${ANDROID_ABI}/libturbojpeg.a)
add_library(libjpeg STATIC IMPORTED)
set_target_properties(libjpeg
PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/third/arm/${ANDROID_ABI}/libjpeg.a)
#add_library(pplcommon STATIC IMPORTED)
#set_target_properties(pplcommon
# PROPERTIES IMPORTED_LOCATION
# ${CMAKE_SOURCE_DIR}/third/arm/${ANDROID_ABI}/libpplcommon_static.a)
#add_library(pplcv STATIC IMPORTED)
#set_target_properties(pplcv
# PROPERTIES IMPORTED_LOCATION
# ${CMAKE_SOURCE_DIR}/third/arm/${ANDROID_ABI}/libpplcv_static.a)
add_library(curl STATIC IMPORTED)
set_target_properties(curl
PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/third/arm/${ANDROID_ABI}/libcurl.a)
add_library(ssl STATIC IMPORTED)
set_target_properties(ssl
PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/third/arm/${ANDROID_ABI}/libssl.a)
add_library(crypto STATIC IMPORTED)
set_target_properties(crypto
PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/third/arm/${ANDROID_ABI}/libcrypto.a)
add_library(avcodec STATIC IMPORTED)
set_target_properties(avcodec
PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/third/arm/${ANDROID_ABI}/ffmpeg-lite/libavcodec.a)
add_library(avformat STATIC IMPORTED)
set_target_properties(avformat
PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/third/arm/${ANDROID_ABI}/ffmpeg-lite/libavformat.a)
add_library(avutil STATIC IMPORTED)
set_target_properties(avutil
PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/third/arm/${ANDROID_ABI}/ffmpeg-lite/libavutil.a)
add_library(swresample STATIC IMPORTED)
set_target_properties(swresample
PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/third/arm/${ANDROID_ABI}/ffmpeg-lite/libswresample.a)
add_library(swscale STATIC IMPORTED)
set_target_properties(swscale
PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/third/arm/${ANDROID_ABI}/ffmpeg-lite/libswscale.a)
find_library(log-lib log)
add_library(onnx-lib SHARED IMPORTED)
set_target_properties(
onnx-lib
PROPERTIES IMPORTED_LOCATION
${CMAKE_SOURCE_DIR}/third/arm/${ANDROID_ABI}/libonnxruntime.so)
target_link_libraries(scrfdncnn
ncnn
${OpenCV_LIBS}
${log-lib}
onnx-lib
camera2ndk
mediandk
libjpeg
turbojpeg
avformat
avcodec
avutil
swresample
swscale
curl
ssl
crypto
-landroid
-lmediandk
-lEGL
-lGLESv2
-lm -lz
)
#ifndef HEADER_AES_H
# define HEADER_AES_H
# include <stddef.h>
# define AES_ENCRYPT 1
# define AES_DECRYPT 0
# define AES_MAXNR 14
# define AES_BLOCK_SIZE 16
struct aes_key_st {
# ifdef AES_LONG
unsigned long rd_key[4 * (AES_MAXNR + 1)];
# else
unsigned int rd_key[4 * (AES_MAXNR + 1)];
# endif
int rounds;
};
typedef struct aes_key_st AES_KEY;
int AES_set_encrypt_key(const unsigned char *userKey, const int bits, AES_KEY *key);
int AES_set_decrypt_key(const unsigned char *userKey, const int bits, AES_KEY *key);
void AES_encrypt(const unsigned char *in, unsigned char *out, const AES_KEY *key);
void AES_decrypt(const unsigned char *in, unsigned char *out, const AES_KEY *key);
void AES_ecb_encrypt(const unsigned char *in, unsigned char *out, const AES_KEY *key,
const int enc);
void AES_cbc_encrypt(const unsigned char *in, unsigned char *out,
size_t length, const AES_KEY *key,
unsigned char *ivec, const int enc);
#endif
/*
* Copyright 2002-2016 The OpenSSL Project Authors. All Rights Reserved.
*
* Licensed under the OpenSSL license (the "License"). You may not use
* this file except in compliance with the License. You can obtain a copy
* in the file LICENSE in the source distribution or at
* https://www.openssl.org/source/license.html
*/
#include "aes.h"
#include "modes.h"
void AES_cbc_encrypt(const unsigned char *in, unsigned char *out,
size_t len, const AES_KEY *key,
unsigned char *ivec, const int enc)
{
if (enc)
CRYPTO_cbc128_encrypt(in, out, len, key, ivec,
(block128_f) AES_encrypt);
else
CRYPTO_cbc128_decrypt(in, out, len, key, ivec, (block128_f) AES_decrypt);
}
/*
* Copyright 2002-2016 The OpenSSL Project Authors. All Rights Reserved.
*
* Licensed under the OpenSSL license (the "License"). You may not use
* this file except in compliance with the License. You can obtain a copy
* in the file LICENSE in the source distribution or at
* https://www.openssl.org/source/license.html
*/
#include <assert.h>
#include "aes.h"
#include "aes_locl.h"
void AES_ecb_encrypt(const unsigned char *in, unsigned char *out, const AES_KEY *key, const int enc)
{
assert(in && out && key);
assert((AES_ENCRYPT == enc) || (AES_DECRYPT == enc));
if (AES_ENCRYPT == enc)
AES_encrypt(in, out, key);
else
AES_decrypt(in, out, key);
}
/*
* Copyright 2002-2016 The OpenSSL Project Authors. All Rights Reserved.
*
* Licensed under the OpenSSL license (the "License"). You may not use
* this file except in compliance with the License. You can obtain a copy
* in the file LICENSE in the source distribution or at
* https://www.openssl.org/source/license.html
*/
#ifndef HEADER_AES_LOCL_H
# define HEADER_AES_LOCL_H
//# include <e_os2.h>
# include <stdio.h>
# include <stdlib.h>
# include <string.h>
# if defined(_MSC_VER) && (defined(_M_IX86) || defined(_M_AMD64) || defined(_M_X64))
# define SWAP(x) (_lrotl(x, 8) & 0x00ff00ff | _lrotr(x, 8) & 0xff00ff00)
# define GETU32(p) SWAP(*((u32 *)(p)))
# define PUTU32(ct, st) { *((u32 *)(ct)) = SWAP((st)); }
# else
# define GETU32(pt) (((u32)(pt)[0] << 24) ^ ((u32)(pt)[1] << 16) ^ ((u32)(pt)[2] << 8) ^ ((u32)(pt)[3]))
# define PUTU32(ct, st) { (ct)[0] = (u8)((st) >> 24); (ct)[1] = (u8)((st) >> 16); (ct)[2] = (u8)((st) >> 8); (ct)[3] = (u8)(st); }
# endif
# ifdef AES_LONG
typedef unsigned long u32;
# else
typedef unsigned int u32;
# endif
typedef unsigned short u16;
typedef unsigned char u8;
# define MAXKC (256/32)
# define MAXKB (256/8)
# define MAXNR 14
/* This controls loop-unrolling in aes_core.c */
# undef FULL_UNROLL
#endif /* !HEADER_AES_LOCL_H */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
#include <string.h>
#include "gj_aes.h"
#include "aesmain.h"
int mainenc(int enc,char* infn,char* outfn){
char result[255] ;
memset(result,0,255);
char* key = "yymrjzbwyrbjszrk";
char* aiv = "yymrjzbwyrbjszrk";
int base64 = 1;
int outlen = 0;
int encrst = 0;
char* fn1 = infn;
char* fn2 = outfn;
FILE* fr = fopen(fn1,"rb");
FILE* fw = fopen(fn2,"wb");
while(1){
if(!fr){
encrst = -1001;
break;
}
if(!fw){
encrst = -1002;
break;
}
gj_aesc_t* aesc = NULL;
init_aesc(key,aiv,enc,&aesc);
uint64_t size = 0;
uint64_t realsize = 0;
if(enc){
fwrite("gjdigits",1,8,fw);
fwrite(&size,1,8,fw);
fwrite(&size,1,8,fw);
fwrite(&size,1,8,fw);
while(!feof(fr)){
char data[16];
memset(data,0,16);
uint64_t rst = fread(data,1,16,fr);
if(rst){
size +=rst;
do_aesc(aesc,data,16,result,&outlen);
fwrite(result,1,outlen,fw);
}
}
fseek(fw,8,0);
fwrite(&size,1,8,fw);
}else{
uint64_t rst = fread(result,1,32,fr);
if(!rst){
encrst = -1003;
break;
}
if((result[0]!='g')||(result[1]!='j')){
encrst = -1004;
break;
}
uint64_t *psize = (uint64_t*)(result+8);
realsize = *psize;
if(realsize>1034*1024*1024){
encrst = -1005;
break;
}
while(!feof(fr)){
char data[16];
memset(data,0,16);
uint64_t rst = fread(data,1,16,fr);
if(rst){
size +=rst;
do_aesc(aesc,data,16,result,&outlen);
if(size>realsize){
outlen -= (size-realsize);
//printf("===%lu > %lu rst %lu %d outlen \n",size,realsize,rst,outlen);
}
fwrite(result,1,outlen,fw);
}
}
}
break;
}
if(fr) fclose(fr);
if(fw) fclose(fw);
return encrst;
}
#ifdef TEST
int main(int argc,char** argv){
if(argc<4){
printf("gaes enc|dec filein fileout\n");
return 0;
}
char k = argv[1][0];
if(k=='e'){
int rst = mainenc(1,argv[2],argv[3]);
printf("====enc %s to %s rst %d\n",argv[2],argv[3],rst);
return rst;
}else if(k=='d'){
int rst = mainenc(0,argv[2],argv[3]);
printf("====dec %s to %s rst %d\n",argv[2],argv[3],rst);
return rst;
}else{
printf("gaes enc|dec filein fileout\n");
return 0;
}
}
#endif
#ifndef __AESMAIN_H
#define __AESMAIN_H
#include "gj_dll.h"
#ifdef __cplusplus
extern "C" {
#endif
int mainenc(int enc,char* infn,char* outfn);
#ifdef __cplusplus
}
#endif
#endif