Zenbo SDK - Getting Started
System Requirements
Zenbo is based on Android M. You can use Android Studio or any other IDE for Android development. Please refer to the Android Studio documentation for more information.
SDK package
- Java Jar file (ZenboSDK.jar)
- Simple library with an extended Android Activity
- RobotDevExample source code
- Javadoc for ZenboSDK
SDK requirements
- Android M, API level 23
- Depends on the Google GSON library
About Zenbo SDK
DS: Dialogue System
CSR: Continuous Speech Recognition
SLU: Spoken Language Understanding
App bring-up
Prepared by developer:
- Add a cross intent to the DS editor, such as “take a photo”, then set your package name and launch Activity name.
- Use the DS editor to define app actions so that you can control the app by using voice commands.
- Develop your app using the Zenbo SDK and its subclasses: motion, vision, robot, and utility.
Executed by user:
- Use the voice command “Hey Zenbo” to open the CSR, then say the cross intent (e.g. “take a photo”).
- The DS sends the voice command to the cloud and returns the SLU result.
- The robot framework opens the app by parsing the SLU result.
- The app receives the in-app SLU result defined in the DS editor through the Zenbo SDK and performs the corresponding action.
- The app can access the robot’s functions through the Zenbo SDK.
Zenbo SDK Architecture
Subclass
The Zenbo SDK includes the following subclasses:
- Robot: Dialog system, Expression.
- Vision: Face detect/recognize, Body tracking, Arm gesture, Measure Height.
- Motion: Move Body/Head, Remote control.
- Utility: Follow user, Go there by gesture, Play emotional action.
- WheelLights: Blinking, Breathing, Charging, Marquee.
Callback
Every function returns a command serial number. Developers can use the serial number to verify the result of a specific command (see the sketch after this list).
- General callback
- onStateChange: returns the command status, one of ACTIVE, PENDING, SUCCESS, or FAIL. If the status is FAIL, the parameter contains an error code.
- onResult: returns result parameters when the command is processed.
- DS callback: Listen class
- onResult: SLU result.
- onRetry: the DS automatically asks the user again when it can’t recognize the result.
- Vision callback
- onRecognizePersonResult: returns the face recognition result, including face location, user ID, face box location, etc.
- onDetectPersonResult: returns the person’s location.
- onGesturePoint: returns the arm gesture point position.
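For example, a minimal sketch (field and handler names assumed; the RobotCmdState constant name follows the status list above) that saves a command's serial number and matches it in onStateChange:
// Keep the serial number a command returns and match it in onStateChange
// to verify that particular command.
private int mMoveSerial = -1;

private void moveForward() {
    // Every SDK function returns a command serial number.
    mMoveSerial = robotAPI.motion.moveBody(0.5f, 0, 0);
}

public RobotCallback robotCallback = new RobotCallback() {
    @Override
    public void onStateChange(int cmd, int serial, RobotErrorCode err_code, RobotCmdState state) {
        super.onStateChange(cmd, serial, err_code, state);
        if (serial == mMoveSerial && state == RobotCmdState.SUCCESS) {
            // our moveBody command finished successfully
        }
    }
};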
SLU example
{
    "system_info": {
        "version_info": "ver1.0"
    },
    "event_slu_query": {
        "user_utterance": [
            {
                "CsrType": "Google",
                "result": [
                    "show me a",
                    "show me a call",
                    "show me",
                    "show me all",
                    "show me a song"
                ]
            },
            {
                "CsrType": "vocon",
                "result": "show me the photo"
            }
        ],
        "correctedSentence": "show me the photo",
        "error_code": "success",
        "app_semantic": {
            "IntentionId": "gallery.photo.request",
            "Domain": "49",
            "domain": "com.asus.robotgallery",
            "CrossIntent": true,
            "output_context": [
                "gallery_device_previous_state",
                "gallery_device_choose",
                "gallery_device_choose_number",
                "gallery_cancel",
                "gallery_quit",
                "gallery_show_tutorial",
                "gallery_repeat_tts"
            ],
            "Phrase": []
        },
        "speaker_id": "",
        "doa": 0
    }
}
Please refer to the DS document for more information.
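As an illustration, here is a minimal sketch (hypothetical helper using the standard org.json classes; field names follow the sample above) that extracts the corrected sentence and the intention ID:
import org.json.JSONException;
import org.json.JSONObject;
import android.util.Log;

// Hypothetical helper; field names follow the SLU example above.
static void handleSlu(JSONObject slu) throws JSONException {
    JSONObject query = slu.getJSONObject("event_slu_query");
    String sentence = query.getString("correctedSentence");    // "show me the photo"
    String intention = query.getJSONObject("app_semantic")
                            .getString("IntentionId");         // "gallery.photo.request"
    Log.d("SluExample", intention + " / " + sentence);
}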
Important notes on Zenbo app development
Manifest file
Zenbo SDK requires the following tags:
<application
    android:allowBackup="true"
    android:icon="@mipmap/ic_launcher"
    android:label="@string/app_name"
    android:supportsRtl="true"
    android:theme="@style/AppTheme">
    <meta-data android:name="zenbo_ds_domainuuid" android:value="82F199B9E7774C688114A72457E3C223"/>
    <meta-data android:name="zenbo_ds_version_82F199B9E7774C688114A72457E3C223" android:value="0.0.1" />
    <activity
        android:name=".MainActivity"
        ...
        <intent-filter>
            <action android:name="android.intent.action.MAIN" />
            <category android:name="android.intent.category.LAUNCHER" />
            <category android:name="com.asus.intent.category.ZENBO" />
            <category android:name="com.asus.intent.category.ZENBO_LAUNCHER" />
        </intent-filter>
    </activity>
</application>
Note:
<meta-data android:name="zenbo_ds_domainuuid" android:value="82F199B9E7774C688114A72457E3C223"/> and
<meta-data android:name="zenbo_ds_version_82F199B9E7774C688114A72457E3C223" android:value="0.0.1" />:
These entries are the DDE definitions. You must include them in the manifest, or voice commands can’t open the app. “82F199B9E7774C688114A72457E3C223” is the domain UUID, and “0.0.1” is the version of your DDE table. Please refer to the DDE guide for details.
<category android:name="android.intent.category.LAUNCHER" />:
We recommend removing this line. When it is removed, the user can’t open the app by touch and can only open it by voice command.
<category android:name="com.asus.intent.category.ZENBO" />:
This line declares that this activity is compatible with Zenbo SDK. This category is used by the Zenbo app to list the compatible apps installed on the robot.
<category android:name="com.asus.intent.category.ZENBO_LAUNCHER" />:
This line allows your app to show up in the Zenbo Launcher.
Get the cross-intent SLU from the intent at app startup
Intent intent = getIntent();
JSONObject slu_json = null;
try {
    // The SLU result arrives as a JSON string in the "json" extra.
    slu_json = new JSONObject(intent.getStringExtra("json"));
} catch (NullPointerException | JSONException ex) {
    ex.printStackTrace();
}
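You can then branch on the intention ID (value taken from the SLU example earlier; the handler name is hypothetical):
try {
    if (slu_json != null) {
        String intention = slu_json.getJSONObject("event_slu_query")
                                   .getJSONObject("app_semantic")
                                   .getString("IntentionId");
        if ("gallery.photo.request".equals(intention)) {
            showPhoto(); // hypothetical handler for this cross intent
        }
    }
} catch (JSONException ex) {
    ex.printStackTrace();
}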
onCreate - declare RobotAPI
RobotAPI mRobotAPI;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // Pass the activity context and your RobotCallback handler.
    mRobotAPI = new RobotAPI(this, callbackHandle);
}
onPause/onResume:
@Override
protected void onResume() {
    super.onResume();
    mRobotAPI.robot.registerListenCallback(dsCallback);
}

@Override
protected void onPause() {
    super.onPause();
    mRobotAPI.robot.unregisterListenCallback();
}
Note:
Callbacks keep working while the app is paused, so the app needs to call unregisterListenCallback in onPause to block the connection. When the app resumes, call registerListenCallback again to reconnect.
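A minimal sketch of the dsCallback registered above, assuming the Listen class nests under RobotCallback and both callbacks receive a JSONObject; see the example app for the exact signatures:
private final RobotCallback.Listen dsCallback = new RobotCallback.Listen() {
    @Override
    public void onResult(JSONObject jsonObject) {
        super.onResult(jsonObject);
        // in-app SLU result, as defined in the DS editor
        Log.d("RobotDevSample", "onResult: " + jsonObject);
    }

    @Override
    public void onRetry(JSONObject jsonObject) {
        super.onRetry(jsonObject);
        // the DS automatically asks the user again
    }
};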
Navigation and status bar
We recommend hiding the navigation and status bar for the UI.
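For example, a minimal sketch using Android's standard immersive mode (e.g. called from onResume()):
// Hide the status and navigation bars; a swipe from the screen edge
// reveals them temporarily (sticky immersive mode, standard Android API).
getWindow().getDecorView().setSystemUiVisibility(
        View.SYSTEM_UI_FLAG_LAYOUT_STABLE
        | View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION
        | View.SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN
        | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION
        | View.SYSTEM_UI_FLAG_FULLSCREEN
        | View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY);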
Extend RobotActivity
RobotActivity is the starting point for coding an app using the Zenbo SDK. RobotActivity is the base activity that provides easy integration with Zenbo, and is commonly required when creating an activity for Zenbo. Please refer to the example app.
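A minimal sketch of such a subclass, assuming (as in RobotDevExample) that RobotActivity takes your RobotCallback in its constructor; check the example app for the exact signature:
public class MainActivity extends RobotActivity {
    public static RobotCallback robotCallback = new RobotCallback() {
        // override onStateChange, vision callbacks, etc. as needed
    };

    public MainActivity() {
        super(robotCallback); // RobotActivity wires up the RobotAPI binding
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
    }
}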
Understanding the RobotDevExample app
The example project includes three modules:
- ZenboSDK: the Zenbo SDK module.
- RobotActivityLibrary: an AAR module with the extended activity and the imported Zenbo SDK.
- RobotDevSample-RobotDevSample: sample code for the Zenbo SDK.
Motion example: MotionMoveBodyHead.java
This example shows how to use the Zenbo SDK; check the option motion -> moveBody & moveHead.
You can enter the x, y, and theta values to make Zenbo move its body.
mBtn_MoveBody.setOnClickListener(new Button.OnClickListener() {
    @Override
    public void onClick(View v) {
        float x = Float.valueOf(mEditTextMoveBodyX.getText().toString());
        float y = Float.valueOf(mEditTextMoveBodyY.getText().toString());
        float theta = Float.valueOf(mEditTextMoveBodyTheta.getText().toString());
        robotAPI.motion.moveBody(x, y, theta);
    }
});
You can also set the speed level.
mBtn_moveBodyLevel.setOnClickListener(new Button.OnClickListener() {
    @Override
    public void onClick(View v) {
        float x = Float.valueOf(mEditTextMoveBodyLevelX.getText().toString());
        float y = Float.valueOf(mEditTextMoveBodyLevelY.getText().toString());
        float theta = Float.valueOf(mEditTextMoveBodyLevelTheta.getText().toString());
        // Spinner positions are 0-based; speed levels start at 1.
        int selectedLevel = mSpinnerMoveBodyLevel.getSelectedItemPosition() + 1;
        MotionControl.SpeedLevel.Body level = MotionControl.SpeedLevel.Body.getBody(selectedLevel);
        robotAPI.motion.moveBody(x, y, theta, level);
    }
});
moveHead is similar to moveBody. You can enter the yaw, pitch (in degrees; the sample converts them to radians), and speed level.
mBtn_moveHead.setOnClickListener(new Button.OnClickListener() {
    @Override
    public void onClick(View v) {
        // Convert the degree inputs to radians; empty fields default to 0.
        String tmp = mEditText_head_pitch.getText().toString();
        float pitch = TextUtils.isEmpty(tmp) ? 0 : (float) Math.toRadians(Float.valueOf(tmp));
        tmp = mEditText_head_yaw.getText().toString();
        float yaw = TextUtils.isEmpty(tmp) ? 0 : (float) Math.toRadians(Float.valueOf(tmp));
        int selectedLevel = mSpinnerMoveHeadLevel.getSelectedItemPosition() + 1;
        MotionControl.SpeedLevel.Head level = MotionControl.SpeedLevel.Head.getHead(selectedLevel);
        robotAPI.motion.moveHead(yaw, pitch, level);
    }
});
You can get the result via callback.
public static RobotCallback robotCallback = new RobotCallback() {
    @Override
    public void onStateChange(int cmd, int serial, RobotErrorCode err_code, RobotCmdState state) {
        super.onStateChange(cmd, serial, err_code, state);
    }
};
Finally, you can stop the motion with the stopMoving function.
Button stop = (Button) findViewById(R.id.stopButton);
stop.setOnClickListener(new Button.OnClickListener() {
    @Override
    public void onClick(View v) {
        robotAPI.motion.stopMoving();
    }
});
Speak example: RobotSpeak.java
Zenbo can speak the text you enter. As with motions, you can get a success callback from onStateChange.
mEdit = (EditText) findViewById(R.id.speak_edittext);
btn_start_speak.setOnClickListener(new Button.OnClickListener() {
    @Override
    public void onClick(View v) {
        robotAPI.robot.speak(mEdit.getText().toString());
    }
});
You can also stop Zenbo while it is speaking.
btn_stop_speak.setOnClickListener(new Button.OnClickListener() {
    @Override
    public void onClick(View v) {
        robotAPI.robot.stopSpeak();
    }
});
Expression example: RobotSetExpression.java
Zenbo can change its facial expressions and has 24 different faces for you to choose from.
if (SpinnerText.equals("INTEREST")) {
    robotAPI.robot.setExpression(RobotFace.INTEREST);
} else if (SpinnerText.equals("DOUBT")) {
    robotAPI.robot.setExpression(RobotFace.DOUBT);
} else if (SpinnerText.equals("PROUD")) {
    robotAPI.robot.setExpression(RobotFace.PROUD);
} else if (SpinnerText.equals("DEFAULT")) {
    robotAPI.robot.setExpression(RobotFace.DEFAULT);
} else if (SpinnerText.equals("HAPPY")) {
    robotAPI.robot.setExpression(RobotFace.HAPPY);
}
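If the spinner labels match the RobotFace constant names exactly, a more compact variant (assuming RobotFace is a Java enum) could be:
try {
    robotAPI.robot.setExpression(RobotFace.valueOf(SpinnerText));
} catch (IllegalArgumentException ex) {
    robotAPI.robot.setExpression(RobotFace.DEFAULT); // unknown label
}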
PlayAction example: UtilityPlayAction.java
Zenbo has several canned actions to express emotions.
mEditText.setText("22"); // action #22
btn_start.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View arg0) {
        if (mEditText.getText().length() > 0) {
            int iCurrentNumberPickerValue = Integer.parseInt(mEditText.getText().toString());
            robotAPI.utility.playAction(iCurrentNumberPickerValue);
        }
    }
});
Some actions loop. You can stop the loop with a cancel command.
btn_stop.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View arg0) {
        robotAPI.cancelCommand(RobotCommand.MOTION_PLAY_ACTION.getValue());
    }
});
Emotional example: UtilityPlayEmotionalAction.java
Zenbo also has a function that combines facial expressions with canned actions, and the faces can be randomized. Example 1 below plays a canned action with 5 possible faces picked at random.
if (SpinnerText.equals("Example 1")) {
    List<RobotUtil.faceItem> faceItemList = new ArrayList<>();
    faceItemList.add(new RobotUtil.faceItem(RobotFace.DEFAULT, 10));
    faceItemList.add(new RobotUtil.faceItem(RobotFace.HAPPY, 10));
    faceItemList.add(new RobotUtil.faceItem(RobotFace.EXPECT, 10));
    faceItemList.add(new RobotUtil.faceItem(RobotFace.SHOCK, 10));
    faceItemList.add(new RobotUtil.faceItem(RobotFace.LAZY, 10));
    iCurrentCommandSerial = robotAPI.utility.playEmotionalAction(faceItemList, 5);
}
Detect person example: VisionRequestDetectPerson.java
Zenbo can detect a person and return the result via callback.
private void detectPersonClicked() {
    robotAPI.vision.requestDetectPerson(1);
}
The result reports the person’s location via the onDetectPersonResult callback.
@Override
public void onDetectPersonResult(List<DetectPersonResult> resultList) {
    super.onDetectPersonResult(resultList);
    if (resultList.size() == 0) {
        Log.d("RobotDevSample", "onDetectPersonResult: empty");
    } else {
        Log.d("RobotDevSample", "onDetectPersonResult: " + resultList.get(0).getBodyLoc().toString());
    }
}
WheelLights Example: WheelLightsActivity.java
Zenbo has two side-wheel LEDs. You can control their brightness, color, and pattern.
case 2:
    // startBlinking
    robotAPI.wheelLights.turnOff(WheelLights.Lights.SYNC_BOTH, 0xff);
    robotAPI.wheelLights.setColor(WheelLights.Lights.SYNC_BOTH, 0xff, 0x007F7F);
    robotAPI.wheelLights.setBrightness(WheelLights.Lights.SYNC_BOTH, 0xff, 10);
    robotAPI.wheelLights.startBlinking(WheelLights.Lights.SYNC_BOTH, 0xff, 30, 10, 5);
    break;
case 3:
    // startBreathing
    robotAPI.wheelLights.turnOff(WheelLights.Lights.SYNC_BOTH, 0xff);
    robotAPI.wheelLights.setColor(WheelLights.Lights.SYNC_BOTH, 0xff, 0x00D031);
    robotAPI.wheelLights.setBrightness(WheelLights.Lights.SYNC_BOTH, 0xff, 10);
    robotAPI.wheelLights.startBreathing(WheelLights.Lights.SYNC_BOTH, 0xff, 20, 10, 0);
    break;
case 4:
    // startCharging
    robotAPI.wheelLights.turnOff(WheelLights.Lights.SYNC_BOTH, 0xff);
    robotAPI.wheelLights.setColor(WheelLights.Lights.SYNC_BOTH, 0xff, 0xFF9000);
    robotAPI.wheelLights.setBrightness(WheelLights.Lights.SYNC_BOTH, 0xff, 10);
    robotAPI.wheelLights.startCharging(WheelLights.Lights.SYNC_BOTH, 0, 1, WheelLights.Direction.DIRECTION_FORWARD, 20);
    break;
case 5:
    // startMarquee
    robotAPI.wheelLights.turnOff(WheelLights.Lights.SYNC_BOTH, 0xff);
    robotAPI.wheelLights.setBrightness(WheelLights.Lights.SYNC_BOTH, 0xff, 20);
    robotAPI.wheelLights.startMarquee(WheelLights.Lights.SYNC_BOTH, WheelLights.Direction.DIRECTION_FORWARD, 40, 20, 14);
    break;