Learn Code Lab
Code Lab: Implement KeyEvent callback by mapping Air Actions

Objective
Learn how to implement a KeyEvent callback for the S Pen remote by mapping Air Actions to a car racing game.

Overview
The S Pen connects to a device via Bluetooth Low Energy (BLE), and the S Pen framework manages the connection. BLE events are converted by the S Pen framework into KeyEvents before being sent to the app. Apps can handle S Pen events by reusing the existing KeyEvent callback, without adding any other interfaces.

Collecting and sending S Pen remote events to apps
The S Pen framework collects and manages the KeyEvents to be received by the app; these are defined in XML format before being made public. The process of handling an S Pen remote event and sending its defined KeyEvent to the app is as follows:
1. A BLE event is sent to the S Pen framework.
2. The S Pen framework checks for the foreground app and looks for the KeyEvent that the app has made public.
3. The found KeyEvent is sent to the app's KeyEvent callback.
4. The app performs the actions defined for the KeyEvent.

Set up your environment
You will need the following:
- Java SE Development Kit (JDK) 10 or later
- Android Studio (latest version recommended)
- Samsung Galaxy device with S Pen remote capability: Galaxy Tab S6 series or newer, Galaxy S22 Ultra or newer, Galaxy Fold3 or newer

Sample code
Here is sample code for you to start coding in this code lab. Download it and start your learning experience!
Air Actions sample code (6.4 MB)

Start your project
In Android Studio, click Open an existing Android Studio project. Locate the Android project in the directory and click OK.

Implement remote actions
1. Check for the activity that handles the KeyEvent in the manifest file, AndroidManifest.xml, and add <intent-filter> and <meta-data> elements to that activity. The resource file defining the remote actions must be specified in the <meta-data> element:

```xml
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.spenremote.racingcar">
    <application>
        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="com.samsung.android.support.REMOTE_ACTION" />
            </intent-filter>
            <meta-data
                android:name="com.samsung.android.support.REMOTE_ACTION"
                android:resource="@xml/remote_action_sample" />
        </activity>
    </application>
</manifest>
```

Note: only one set of remote actions per app is allowed at the moment. If you define remote actions in several activities, all except one may be ignored.

2. Create an XML file under res/xml/. Name the file with the same name as the resource specified previously.

3. Create the desired <action> elements in the XML file, as shown below. The XML has a root element of <remote-actions> and may include several <action> elements. Each <action> contains information such as id, label, priority, and trigger_key. If you want to map a default action to a specific gesture, add a <preference> element:

```xml
<?xml version="1.0" encoding="utf-8"?>
<remote-actions version="1.2">
    <action id="pause_or_resume"
            label="@string/pause_or_resume"
            priority="1"
            trigger_key="space">
        <preference name="gesture" value="click" />
        <preference name="button_only" value="true" />
    </action>
    <action id="move_left"
            label="@string/move_car_left"
            priority="2"
            trigger_key="dpad_left">
        <preference name="gesture" value="swipe_left" />
        <preference name="motion_only" value="true" />
    </action>
    <action id="move_right"
            label="@string/move_car_right"
            priority="3"
            trigger_key="dpad_right">
        <preference name="gesture" value="swipe_right" />
        <preference name="motion_only" value="true" />
    </action>
    <action id="restart"
            label="@string/restart"
            priority="4"
            trigger_key="ctrl+r">
        <preference name="gesture" value="circle_cw|circle_ccw" />
        <preference name="motion_only" value="true" />
    </action>
</remote-actions>
```

Implement the KeyEvent callback
Apply the KeyEvent callback to the activity where the remote actions have been declared. It is recommended to handle the sent KeyEvent in onKeyDown().

Tip: learn more about the KeyEvent callback in the Android Developers documentation.

Add the following KeyEvent callback, which maps the keys as follows:
- Space key: pause or resume the racing game
- Left key: move the car to the left
- Right key: move the car to the right
- Ctrl + R: restart the racing game

```java
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
    if (keyCode == KeyEvent.KEYCODE_SPACE) {
        RacingCar.PlayState state = RacingCar.getPlayState();
        if (state == RacingCar.PlayState.PLAYING) {
            pause();
        } else {
            resume();
        }
    } else if (keyCode == KeyEvent.KEYCODE_DPAD_LEFT) {
        moveCarTo(RacingCar.POS_LEFT);
    } else if (keyCode == KeyEvent.KEYCODE_DPAD_RIGHT) {
        moveCarTo(RacingCar.POS_RIGHT);
    } else if ((event.getMetaState() & KeyEvent.META_CTRL_ON) != 0
            && keyCode == KeyEvent.KEYCODE_R) {
        restart();
    }
    return super.onKeyDown(keyCode, event);
}
```

Note: pay attention to cases where a child view consumes the KeyEvent. For example, if an EditText is focused on a screen where it co-exists with other views, most KeyEvents are consumed for text input or cursor movement, and they may not be sent to the activities or container layouts on that screen. You must specify a key code or key combination that the child view cannot consume.

Run the app
Now everything is ready. When you install your app, make sure that your app icon appears properly under Settings > Advanced features > S Pen > Air Actions. You can now control the car in the game using Air Actions. You're done!
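The dispatch logic in onKeyDown() can also be exercised outside Android as a small stand-alone model. The constants below are local stand-ins rather than Android's KeyEvent class (the values mirror Android's but are defined here so the sketch is self-contained); the point is the branch structure and the Ctrl-modifier bitmask check:

```java
public class KeyDispatchDemo {
    // Local stand-ins for the Android KeyEvent constants used by the code lab.
    static final int KEYCODE_SPACE = 62;
    static final int KEYCODE_DPAD_LEFT = 21;
    static final int KEYCODE_DPAD_RIGHT = 22;
    static final int KEYCODE_R = 46;
    static final int META_CTRL_ON = 0x1000;

    // Mirrors the branch structure of onKeyDown(): returns which game action fires.
    static String dispatch(int keyCode, int metaState) {
        if (keyCode == KEYCODE_SPACE) {
            return "pause_or_resume";
        } else if (keyCode == KEYCODE_DPAD_LEFT) {
            return "move_left";
        } else if (keyCode == KEYCODE_DPAD_RIGHT) {
            return "move_right";
        } else if ((metaState & META_CTRL_ON) != 0 && keyCode == KEYCODE_R) {
            // Ctrl+R: the modifier is checked via a bitmask, not equality,
            // because other meta flags may be set at the same time.
            return "restart";
        }
        return "unhandled";
    }

    public static void main(String[] args) {
        System.out.println(dispatch(KEYCODE_SPACE, 0));        // pause_or_resume
        System.out.println(dispatch(KEYCODE_R, META_CTRL_ON)); // restart
        System.out.println(dispatch(KEYCODE_R, 0));            // unhandled (no Ctrl held)
    }
}
```

Note that Ctrl+R falls through to "unhandled" when the Ctrl bit is absent, which is why the real callback still forwards unconsumed events via super.onKeyDown().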
Congratulations! You have successfully achieved the goal of this code lab. Now you can map Air Actions events in a game or app by yourself! If you're having trouble, you may download this file:
Air Actions complete code (6.4 MB)
tutorials game, mobile
Blog

Intro
Foldable technology for mobile is a ground-breaking experience, not only for users but also for developers. The many form factors and features, like the immersive display, app continuity, flex mode, and UX optimization, challenge developers to think outside the box to adapt to this technology. There are already a large number of blogs discussing general application considerations, but what about gaming on a foldable device? In this blog, we look at the challenges and new opportunities foldable devices present for mobile game development. This blog focuses on Unity and Unreal Engine, as they are the most common game engines for mobile development; however, much of the information applies to other engines as well.

App continuity and multi-window mode
Firstly, let's establish that the large internal display is called the main display, whilst the smaller external display is called the cover display. App continuity and multi-window mode are two key features of foldable smartphones. With app continuity, you can seamlessly transition from the cover display to the main display without needing to reopen the app. With multi-window, you can run multiple apps at once and multitask like never before.

App continuity
Moving an app between the two displays affects the size, density, and aspect ratio of the display it can use. App continuity is enabled and disabled from the Android manifest of the app: the <activity> element has an attribute called resizeableActivity. If not set, this attribute defaults to true, thus enabling app continuity and assuming the app fully supports both app continuity and multi-window mode. If set to true, the application automatically attempts to resize when moving between displays, as shown in the video below. If set to false, the system still allows you to transition from the cover display to the main display but does not attempt to resize the application.
Instead, a new button appears, allowing the user to restart the application at a time of their choosing, as shown below. When the app transitions between the two displays, the activity is destroyed and recreated. As such, the app data needs to be stored and then used to restore the previous state. From testing, both Unity and Unreal Engine appear to handle this process already; however, if you are developing on a custom engine, it is worth confirming that your engine or app can handle this.

Continue on cover screen
By default, foldable devices simply lock when the main display is closed. However, users can disable this behavior for specific apps. In the device's display settings, there is an option called "Continue on cover screen" that has to be enabled by the user. This menu displays a list of all applications on the device and lets users set specific applications to not lock the device when the main display is closed. If an application has resizeableActivity set to false, the option to enable this functionality is greyed out and unavailable. If your application supports resizing and app continuity, make sure to test this behavior, as users may wish to enable it on their device.

Multi-window mode
Multi-window mode allows up to three apps to run at the same time in a split-screen arrangement (see left image), or two apps to run in pop-up mode, which keeps one app full-screen but opens another app in a smaller pop-up window (see right image). Just like app continuity, this is enabled using the resizeableActivity attribute in the Android manifest. If resizeableActivity is not set, it defaults to true. So again, it is recommended that you set this yourself to ensure that your app works as intended.

App continuity without multi-window mode
If you would like your app to support app continuity but don't want it to be usable in multi-window mode, this is possible.
First, set the resizeableActivity attribute to false to disable both app continuity and multi-window mode. Next, add the following <meta-data> element to your <activity> element:

```xml
<meta-data android:name="android.supports_size_changes" android:value="true" />
```

This re-enables the app continuity feature without enabling multi-window mode.

Game engine features to use
If you are using an existing game engine, there are several existing features that are useful when developing for foldable devices. This section provides a high-level look at these features; for more in-depth information, I recommend visiting the engine's documentation.

Unity
Unity is a very good engine for developing on foldable devices, requiring little to no setup to enable foldable-specific features. The engine automatically handles resizing of the application without any modifications required. The resizeableActivity attribute is controlled by an option in the Player Settings: Resizable Window. When enabled, resizeableActivity appears to be set to true; when disabled, it appears to be set to false.

Note: this option is not available in older versions of Unity. If the option is not present, you have to set resizeableActivity manually.

The engine also provides a very robust UI scaling system that helps reduce the amount of work required to create a UI that works across both the cover display and the main display. The Canvas Scaler and anchor system work very well with foldable devices, resizing and positioning elements consistently across the two displays and saving developers from having to create two UI designs (one for the cover display and one for the main display).

Unreal Engine 4
Unreal Engine 4 is another good engine for developing on foldable devices, although it requires a little more work to set up.
First of all, Unreal Engine 4 by default disables the resize event on Android devices. However, it is possible to re-enable it in Device Profiles using the console variable:

r.EnableNativeResizeEvent = 1

Unreal Engine 4 also has a default maximum aspect ratio of 2.1. This is too small for the cover display of some foldable devices, which can result in black bars on either side of the image. Fortunately, Unreal Engine makes this value easily changeable in the Project Settings: Platforms -> Android -> Max Aspect Ratio. From my testing, 2.8 is a good value; however, I recommend experimenting to find the best value.

Unreal Engine's UMG (Unreal Motion Graphics UI Designer) has a comprehensive set of tools for creating responsive UI designs. This system automatically scales UI elements to fit the screen size, and the anchor system is also very useful for ensuring UI elements are positioned correctly across the two displays.

Samsung Remote Test Lab
No matter what engine you use, the best way to ensure your app works well on a foldable device is to test it on one. Samsung Remote Test Lab has a range of foldable devices available for developers to test their applications on, if getting hold of a physical device is too difficult or expensive. All you need to do is create a Samsung account, and you can start using these devices for testing.

Android Jetpack WindowManager
Despite being very good engines for foldable game development, neither Unity nor Unreal currently provides information about the current state of the foldable device (that is: is it open, closed, or halfway? Where is the fold positioned? What is the fold's orientation?). However, Android has recently created a new library as part of the Android Jetpack libraries that enables developers to access this information and make use of it in their apps.
Android Jetpack, in their own words, is "a suite of libraries to help developers follow best practices, reduce boilerplate code, and write code that works consistently across Android versions and devices so that developers can focus on the code they care about." WindowManager is a new library from the Android Jetpack suite, intended to help application developers support new device form factors and multi-window environments. The library had its 1.0.0 release in January 2022, targeting foldable devices, but, according to the documentation, future versions will extend to more display types and window features.

More technical resources
This blog post has an accompanying technical blog post and code lab tutorial demonstrating how to use Android Jetpack WindowManager with both Unity and Unreal to take advantage of the flex mode feature of Samsung's foldable devices. I recommend starting with the Jetpack WindowManager blog post to learn how to set up the WindowManager in Java:
https://developer.samsung.com/galaxy-gamedev/blog/en-us/2022/07/20/how-to-use-jetpack-window-manager-in-android-game-dev

Then, follow it up with the code labs to learn how to use the WindowManager to implement a simple flex mode setup in either Unreal or Unity:
https://developer.samsung.com/codelab/gamedev/flex-mode-unreal.html
https://developer.samsung.com/codelab/gamedev/flex-mode-unity.html

Also, visit our game developer community to join the discussion around mobile gaming and foldable devices. Click here to find out more about other design considerations when designing apps for foldable devices and large screens.

Final thoughts
Foldable devices provide a richer experience than regular phones. Hopefully, this blog post and the accompanying tutorials have provided you with the necessary information to begin taking advantage of these new form factors and start providing users a richer foldable experience.
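As a recap of the manifest settings covered earlier, here is a minimal sketch of an activity declaration that keeps app continuity while opting out of multi-window mode. The activity name is a placeholder, not from the sample projects:

```xml
<!-- Hypothetical activity declaration: app continuity without multi-window mode.
     resizeableActivity="false" disables both features; the meta-data element
     then re-enables size changes (and therefore app continuity) on its own. -->
<activity
    android:name=".GameActivity"
    android:resizeableActivity="false">
    <meta-data
        android:name="android.supports_size_changes"
        android:value="true" />
</activity>
```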
Additional resources on the Samsung Developers site
The Samsung Developers site has many resources for developers looking to build for and integrate with Samsung devices and services. Stay in touch with the latest news by creating a free account and subscribing to our monthly newsletter. Visit the Galaxy Store Games page for information on bringing your game to Galaxy Store, and visit the Marketing Resources page for information on promoting and distributing your Android apps. Finally, our Developer Forum is an excellent way to stay up to date on all things related to the Galaxy ecosystem.
Sep 16, 2022
Lochlann Henry Ramsay-Edwards
tutorials iot
Blog

Matter is an open-source connectivity standard for smart home and Internet of Things (IoT) devices. It is a secure, reliable, and seamless cross-platform protocol for connecting compatible devices and systems with one another. SmartThings provides the Matter virtual device application and the SmartThings Home APIs to help you quickly develop Matter devices and use the SmartThings ecosystem without needing to build your own IoT ecosystem.

When Matter was first introduced, few devices supported it, so platform companies struggled to test and optimize Matter support on their own devices. To alleviate this issue, Samsung SmartThings developed the Matter virtual device application, which can be used to test and optimize various device types, including those that have not yet been released. The Matter virtual device is part of the Matter open-source project, so you can create and test Matter devices virtually too.

In this tutorial, you will learn how to create an occupancy sensor as a Matter virtual device that you can control using the SmartThings application. You will also gain an understanding of the concept of clusters on Matter devices. For more information about SmartThings Matter, see Matter in SmartThings.

Prerequisites
To follow along with this tutorial, you need the following hardware and software:
- Host PC running Windows 10 or higher, or Ubuntu 20.04 (x64)
- Android Studio (latest version recommended)
- Java SE Development Kit (JDK) 11 or higher
- Devices connected to the same network:
  - Mobile device running Android 8.0 (Oreo) or higher
  - Mobile device with the SmartThings application installed
  - Matter-enabled SmartThings Station onboarded with the same Samsung account as the SmartThings application

For Ubuntu, to set up the development environment:

Step 1. Turn on developer mode and enable USB debugging on your mobile device.
Step 2.
Install the required OS-specific dependencies from your terminal:

```shell
sudo apt-get install git gcc g++ pkg-config libssl-dev libdbus-1-dev \
    libglib2.0-dev libavahi-client-dev ninja-build python3-venv python3-dev \
    python3-pip unzip libgirepository1.0-dev libcairo2-dev libreadline-dev
```

Step 3. From the SDK Manager in Android Studio, install SDK Platform 26 and NDK version 22.1.7171670.
Step 4. Register the NDK path in the PATH environment variable:

```shell
export ANDROID_NDK_HOME=[NDK path]
export PATH=$PATH:${ANDROID_NDK_HOME}
```

Step 5. Install Kotlin compiler (kotlinc) version 1.5.10.
Step 6. Register the kotlinc path in the PATH environment variable:

```shell
export KOTLINC_HOME=[kotlinc path]/bin
export PATH=$PATH:${KOTLINC_HOME}
```

Implement the occupancy sensor device type
To implement the occupancy sensor device type in the Matter virtual device application:

Step 1. Download the sample application project and open it in Android Studio.
Sample code - Virtual Device App (11.11 MB), Nov 21, 2023
Step 2. Build and run the sample application. The application implements an on/off switch device type as an example.
Step 3. Go to the file path feature > main > java > com.matter.virtual.device.app.feature.main.
Step 4. To add the occupancy sensor device type to the application home screen, select the device type in the MainFragment.kt file.

Note: for all code examples in this tutorial, look for "TODO #" in the sample application to find the location where you need to add the code.

```kotlin
// TODO 1
MainUiState.Start -> {
    val itemList = listOf(
        Device.OnOffSwitch, // sample device type
        Device.OccupancySensor, // add the occupancy sensor
    )
```

Retrieve cluster values
Clusters are the functional building blocks of the data model. A cluster can be an interface, a service, or an object class, and it is the lowest independent functional element in the data model. Each Matter device supports a defined set of relevant clusters that can interact with your preferred controller (such as SmartThings).
This allows for easy information retrieval, behavior setting, event notifications, and more.

The following steps are implemented in the file OccupancySensorViewModel.kt at the path feature > sensor > java > com.matter.virtual.device.app.feature.sensor. To retrieve the relevant cluster values from the device through the ViewModel:

Step 1. Retrieve the current occupancy status. The Boolean value is used by OccupancyFragment to update the UI.

```kotlin
// TODO 2
private val _occupancy: StateFlow<Boolean> = getOccupancyFlowUseCase()
val occupancy: LiveData<Boolean>
    get() = _occupancy.asLiveData()
```

Step 2. Retrieve the current battery status. The integer value is used by OccupancyFragment to update the UI.

```kotlin
// TODO 3
private val _batteryStatus: MutableStateFlow<Int> =
    getBatPercentRemainingUseCase() as MutableStateFlow<Int>
val batteryStatus: LiveData<Int>
    get() = _batteryStatus.asLiveData()
```

Step 3. When the "occupancy" button in OccupancyFragment is pressed, use the setOccupancyUseCase() function to update the occupancy status Boolean value.

```kotlin
// TODO 4
viewModelScope.launch {
    Timber.d("current value = ${_occupancy.value}")
    if (_occupancy.value) {
        Timber.d("set value = false")
        setOccupancyUseCase(false)
    } else {
        Timber.d("set value = true")
        setOccupancyUseCase(true)
    }
}
```

Step 4. When the "battery" slider in OccupancyFragment is moved, store the slider progress as the battery status.

```kotlin
// TODO 5
batteryStatus.value = progress
```

Step 5. When the "battery" slider in OccupancyFragment is moved, use the updateBatterySeekbarProgress() function to update the battery status value on the slider, and setBatPercentRemainingUseCase() to update the battery status integer value.

```kotlin
// TODO 6
viewModelScope.launch {
    updateBatterySeekbarProgress(progress)
    setBatPercentRemainingUseCase(progress)
}
```

Monitor cluster values
The observe() function monitors for and reacts to changes in a cluster value.
The following steps are implemented in the file OccupancySensorFragment.kt at the path feature > sensor > java > com.matter.virtual.device.app.feature.sensor. To monitor changes in the virtual occupancy sensor's cluster values:

Step 1. Trigger updating the occupancy status of the virtual device when the "occupancy" button is pressed.

```kotlin
// TODO 7
binding.occupancyButton.setOnClickListener { viewModel.onClickButton() }
```

Step 2. Use the onProgressChanged() function to update the fragment UI through live data from the ViewModel. The onStopTrackingTouch() function triggers updating the battery status when touch tracking on the "battery" slider stops.

```kotlin
// TODO 8
binding.occupancySensorBatteryLayout.titleText.text = getString(R.string.battery)
binding.occupancySensorBatteryLayout.seekbarData =
    SeekbarData(progress = viewModel.batteryStatus)
binding.occupancySensorBatteryLayout.seekbar.setOnSeekBarChangeListener(
    object : SeekBar.OnSeekBarChangeListener {
        override fun onProgressChanged(seekBar: SeekBar, progress: Int, fromUser: Boolean) {
            viewModel.updateBatterySeekbarProgress(progress)
        }

        override fun onStartTrackingTouch(seekBar: SeekBar) {}

        override fun onStopTrackingTouch(seekBar: SeekBar) {
            viewModel.updateBatteryStatusToCluster(seekBar.progress)
        }
    }
)
```

Step 3. Monitor the occupancy status and update the fragment UI when it changes.

```kotlin
// TODO 9
viewModel.occupancy.observe(viewLifecycleOwner) {
    if (it) {
        binding.occupancyValueText.text = getString(R.string.occupancy_state_occupied)
        binding.occupancyButton.setImageResource(R.drawable.ic_occupied)
    } else {
        binding.occupancyValueText.text = getString(R.string.occupancy_state_unoccupied)
        binding.occupancyButton.setImageResource(R.drawable.ic_unoccupied)
    }
}
```

Step 4. Monitor the battery status and update the fragment UI when it changes.
```kotlin
// TODO 10
viewModel.batteryStatus.observe(viewLifecycleOwner) {
    val text: String = getString(R.string.battery_format, it)
    binding.occupancySensorBatteryLayout.valueText.text =
        Html.fromHtml(text, Html.FROM_HTML_MODE_LEGACY)
}
```

Test the virtual device
To test the virtual occupancy sensor device:

Step 1. Build and run the project on your Android device. The application's home screen shows the sample on/off switch device type and the occupancy sensor device type that you just implemented.
Step 2. To create a virtual occupancy sensor device, select Occupancy Sensor and follow the instructions to receive a QR code.
Step 3. In the SmartThings application on the other mobile device, onboard the occupancy sensor by scanning the QR code.
Step 4. Change the occupancy or battery status of the virtual occupancy sensor in the virtual device application. The values are synchronized to the SmartThings application.

Conclusion
This tutorial demonstrated how to implement the occupancy sensor device type in the Matter virtual device application and create a virtual occupancy sensor for testing purposes. To learn about implementing other device types, go to Code Lab (Matter: Create a Virtual Device and Make an Open Source Distribution). The code lab also describes how to contribute to the Matter open-source project.
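As a language-neutral illustration of the retrieve-and-observe pattern used throughout this tutorial, a cluster attribute can be modeled as an observable value holder that pushes its current state to new observers and notifies them on every change. This sketch is entirely hypothetical, with no Android, coroutine, or Matter dependencies:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Minimal stand-in for an observed cluster attribute such as the occupancy flag.
class ObservableValue<T> {
    private T value;
    private final List<Consumer<T>> observers = new ArrayList<>();

    ObservableValue(T initial) { value = initial; }

    void observe(Consumer<T> observer) {
        observers.add(observer);
        observer.accept(value); // deliver the current state immediately, LiveData-style
    }

    void set(T newValue) {
        value = newValue;
        for (Consumer<T> o : observers) o.accept(newValue); // react to the change
    }
}

public class OccupancyDemo {
    // Simulates the tutorial's flow: observe the occupancy value, then toggle it twice.
    static String run() {
        ObservableValue<Boolean> occupancy = new ObservableValue<>(false);
        StringBuilder ui = new StringBuilder();
        occupancy.observe(v -> ui.append(v ? "occupied;" : "unoccupied;"));
        occupancy.set(true);  // "occupancy" button pressed
        occupancy.set(false); // pressed again
        return ui.toString();
    }

    public static void main(String[] args) {
        System.out.println(run()); // unoccupied;occupied;unoccupied;
    }
}
```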
Dec 14, 2023
HyoJung Lee
events game, uiux, health, mobile, galaxy watch, smarttv, marketplace, foldable
Blog

It's almost time again for the Samsung Developer Conference (aka SDC). After a few years of limited travel, virtual-only events, and nasal swabs, we're excited to get back to a live event, if only for one day. As you may have seen on this site and in our social media accounts, SDC is scheduled for October 12 at the Moscone Conference Center in San Francisco. The keynote address and technical sessions from the live event will be streamed via YouTube, and additional virtual content will be available to view beginning October 13 on the conference web site.

SDC is a special time for us on the Samsung Developers team worldwide. Internally, we have members in many countries: South Korea, the Philippines, the United Kingdom, Vietnam, Brazil, Poland, Bangladesh, and the United States. We're all excited to come together each year to bring useful and informative content to you. We enjoy planning and staging the event each year, and we all have great memories. In this post, we'll share some of our favorite experiences with you.

Behind the scenes
We understand that the boring details behind the scenes aren't exciting to most, but for our events team, the challenge of planning and pulling off a major event is a thrill. Most years, planning for SDC begins almost the moment the previous event is completed. The planners begin executing contracts with the venue, caterers, and the experiential vendor that handles all the small details. They meet weekly on a conference call, usually in the evenings for those in the US, to talk with our counterparts in South Korea. As the months move along, the teams secure contracts with audio-visual teams to run the massive wall of displays on the keynote stage, as well as the cameras in the rooms where sessions are held. We look through dozens of clothing catalogs to find apparel for attendees as well as employees.
During this time, we plan the social media notices, come up with topics for blog posts, and design the email campaigns that go out to hundreds of thousands of developers. We discuss the colors of signs in great detail to make sure each shade is exactly right. We talk to the technical teams to understand which new tech is ready for developers to use. We track down guest speakers from other companies to show how Samsung works with so many partners. As the months go by, we work more and more on SDC tasks. A few days before showtime, you will see many of our team on-site at the venue, setting up demo stations, wiping down counters, and making sure everything is "just right," so that you're amazed the moment you walk through those doors.

Best of Galaxy Store Awards
Beginning in 2018, the Samsung Developers team started the Best of Galaxy Store Awards program as a way to show our appreciation for the developers and designers who bring their products to Galaxy Store. Some of the top names in the world of gaming, productivity, and social media have received the awards. To name a few:
- Riot Games' League of Legends: Wild Rift (2021 Best Strategy Game)
- Epic Games' Fortnite (2020 Game of the Year)
- Spotify (2020 Bixby Capsule of the Year)
- Gameloft's Asphalt 9 (2019 Best Racing Game)
- TikTok (2019 Best Social App)
- Disney Heroes (2018 Best Game)

Best of Galaxy Store winners at SDC

While the 2018 and 2019 awards were presented live at SDC, the COVID pandemic required us to make the awards show a virtual event. You can watch the 2020 and 2021 awards on YouTube. The 2022 awards are scheduled for December 6, 2022, on YouTube. Stay tuned to the Samsung Developers channels on Twitter, Facebook, LinkedIn, and YouTube to watch the awards show when it premieres. For a full list of Samsung Best of Galaxy Store Awards winners, click here.

Before and after hours
SDC isn't all business, though: the organizers and attendees also love to have fun.
Before the keynote, attendees could join groups stretching and going for a run through the city streets. Others opted for calm and quiet with yoga at the expo hall stage.

Staying fit at SDC

A tech conference wouldn't be complete without an after-hours party, and SDC is no different. In 2014, the SDC crowd went to San Francisco's famed Exploratorium, where they experimented with science projects among tables of excellent food and drinks. Attendees in 2016 experienced the beautiful Asian Art Museum in San Francisco. In 2017, attendees walked virtually through a murder mystery with Samsung Gear VR headsets. The 2019 SDC brought even more excitement to the after-hours party: as the event was held at the end of October, the theme was "Day of the Dev," playing on the name of the holiday celebrated in November. A VR-powered interactive journey led players through a spooky experience, while Halloween-themed goodies were available throughout the expo hall. A foggy mist permeated the place, while characters from the VR journey appeared in real life, to the delight of the crowd.

Day of the Dev at SDC

Technical sessions
No matter your interests, if you're into tech, SDC probably has a topic that will pique your curiosity. Whether you are into mobile design, voice control, smart devices, enterprise security, streaming media, or a good old-fashioned shoot-em-up game, you will find plenty of technical sessions to satisfy your need to know more.

Foldables and security are popular topics at SDC

After you've taken in the experts' ideas in the technical sessions, you can start building your own ideas on Samsung's platforms in the Code Lab area of the expo hall. At the Code Lab, you can talk with the actual developers of many of Samsung's best products and learn from them. This interactive experience is also available to anyone at any time on the Samsung Developers web site.
Keynote and Spotlight Session
The keynote presentation is always the subject of great speculation on the internet in the days prior to SDC. Samsung Electronics President and CEO DJ Koh is a favorite each year. Do you remember the 2018 SDC at Moscone West, when Senior Vice President Justin Denison walked on stage to present the Galaxy Fold, the first foldable mobile device? That is one of many exciting moments from the first day of the event.

Revealing the Galaxy Fold

The SDC keynote includes other memorable moments, such as the announcement of One UI, which brought a sensible and consistent design ethos to Samsung mobile and wearable devices. Bixby CTO Adam Cheyer used his time on stage to show the power of AI, while also amazing and amusing the audience with his sleight of hand.

The second day starts off with the Spotlight Session, where Samsung focuses on companies building successful partnerships with Samsung platforms and products. Compared to the flash and excitement of the keynote, the Spotlight Session is where executives and developers share their experiences with their peers.

Spotlight Session highlights: behind the scenes with Vitalik Buterin; on stage with Tim Sweeney and John Hanke

Some of the more memorable moments of the Spotlight Sessions include a fireside chat about games and how to monetize them on mobile platforms, featuring Epic Games CEO Tim Sweeney and Niantic Labs CEO John Hanke. Another memorable Spotlight Session brought Ethereum co-founder Vitalik Buterin to the stage with Samsung's John Jun to talk about the possibilities of blockchain for mobile devices.

One final Spotlight Session memory that's special to those of us on the Samsung Developers team, aside from putting all this together for you, is from 2018. The week before the event, one of the team's members proposed using the Note9 with S Pen to take a photo on stage and send it to Twitter.
In three days, they had the prototype working and tested. The team convinced VP Yoon Lee to perform this live demo on stage to close the event. As you can see below, the demo worked flawlessly.

Wrapping up

After SDC is finished, the Samsung Developers team is there to put everything away. Demo machines are disassembled and our booth is sent to the warehouse. We pack up leftover t-shirts and make sure we leave the venue cleaner than we found it. After that, we have a well-earned party for ourselves and a good night's sleep.

Thank you for reading through our memories of past SDC events. Please join us online, October 12, 2022, at 10 AM PT, for the next Samsung Developer Conference keynote. Let us know your favorite moments from SDC. Join us on Twitter, Facebook, LinkedIn, and YouTube to continue the discussion.
Oct 5, 2022
Code Lab: Measure blood oxygen level and heart rate on Galaxy Watch

Objective

Create a health app for Galaxy Watch, operating on Wear OS powered by Samsung, that uses the Samsung Health Sensor SDK to trigger and obtain the results of simultaneous blood oxygen level (SpO2) and heart rate measurements.

Overview

The Samsung Health Sensor SDK provides a means of accessing and tracking health information contained in the health data storage. Its tracking service gives raw and processed sensor data, such as accelerometer and body composition data, sent by the Samsung BioActive sensor. The latest BioActive sensor of Galaxy Watch runs powerful health sensors such as photoplethysmogram (PPG), electrocardiogram (ECG), bioelectrical impedance analysis (BIA), sweat loss, and SpO2. See the Samsung Health Sensor SDK descriptions for detailed information.

Set up your environment

You will need the following:
- Galaxy Watch4 or newer
- Android Studio (latest version recommended)
- Java SE Development Kit (JDK) 11 or later

Sample code

Here is a sample code for you to start coding in this Code Lab. Download it and start your learning experience!
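The device-connection steps in this code lab use adb over Wi-Fi. As a rough sketch of the command shapes involved (the IP address, ports, and pairing code below are hypothetical placeholders; copy the real values from the watch's Wireless debugging screen):

```shell
# Hypothetical values - replace with those shown under
# Settings > Developer options > Wireless debugging > Pair new device.
WATCH_IP="192.168.0.42"
PAIR_PORT="37521"
PAIR_CODE="123456"
CONNECT_PORT="40551"

# Shape of the pairing and connection commands used in the steps below:
PAIR_CMD="adb pair ${WATCH_IP}:${PAIR_PORT} ${PAIR_CODE}"
CONNECT_CMD="adb connect ${WATCH_IP}:${CONNECT_PORT}"
echo "$PAIR_CMD"
echo "$CONNECT_CMD"
```

Note that the pairing port and the connection port are usually different; the connection port is shown on the main Wireless debugging screen, not in the pairing dialog.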
Measuring Blood Oxygen Level and Heart Rate sample code (159.2 KB)

Connect your Galaxy Watch to Wi-Fi

Go to Settings > Connection > Wi-Fi and make sure that Wi-Fi is enabled. From the list of available Wi-Fi networks, choose and connect to the same one as your PC.

Turn on developer mode and adjust its settings

On your watch, go to Settings > About watch > Software and tap Software version five times. Upon successful activation of developer mode, a toast message displays, as in the image below. Afterwards, Developer options is visible under Settings. Tap Developer options and enable the following:
- ADB debugging
- In Developer options, find Wireless debugging and turn it on. Check Always allow on this network and tap Allow.
- Go back to Developer options and tap Turn off automatic Wi-Fi.

Note: There may be differences in settings depending on your One UI version.

Connect your Galaxy Watch to Android Studio

1. Go to Settings > Developer options > Wireless debugging and choose Pair new device. Take note of the Wi-Fi pairing code, IP address, and port.
2. In Android Studio, go to the terminal and type:

    adb pair <IP address>:<port> <Wi-Fi pairing code>

3. When prompted, tap Always allow from this computer to allow debugging.
4. After successfully pairing, type:

    adb connect <IP address of your watch>:<port>

Upon successful connection, you will see the following message in Android Studio's terminal: connected to <IP address of your watch>. Now, you can run the app directly on your watch.

Turn on developer mode for Health Platform

On your watch, go to Settings > Apps > Health Platform. Quickly tap the Health Platform title 10 times. This enables developer mode and displays [Dev Mode] below the title. To stop using developer mode, quickly tap the Health Platform title 10 times again to disable it.

Start your project

In Android Studio, click Open to open an existing project. Locate the downloaded Android project from the directory and click OK.

Establish a service connection and check capabilities

For the device to track data with the Samsung Health Sensor SDK, it must connect to the service via the HealthTrackingService API. After establishing a connection, verify that the required tracker type is available: get the list of available tracker types and check that the tracker is on the list. In this Code Lab, you use the blood oxygen level and heart rate trackers. The HealthTrackingService API usage is shown below.

HealthTrackingService: initiates a connection to Samsung's health tracking service and provides a HealthTracker instance to track a HealthTrackerType.
- public void connectService(): establish a connection with Samsung's health tracking service
- public void disconnectService(): release a connection to Samsung's health tracking service
- public HealthTrackerCapability getTrackingCapability(): provide a HealthTrackerCapability instance to get the supported health tracker type list

Initialize multiple trackers

Before the measurement starts, initialize the SpO2 tracker by obtaining the proper HealthTracker object. In the ConnectionManager.java file, navigate to initSpo2(), create an oxygen saturation HealthTracker instance, and pass it to the Spo2Listener instance:
- Get the HealthTracker object using the HealthTrackingService API. Use the HealthTrackerType.SPO2_ON_DEMAND type as parameter.
  - public HealthTracker getHealthTracker(HealthTrackerType healthTrackerType): provide a HealthTracker instance for the given HealthTrackerType
- Pass the HealthTracker object to the Spo2Listener instance object.
  - Spo2Listener: public void setHealthTracker(HealthTracker tracker): set the HealthTracker instance for the given tracker

    /*******************************************************************************************
     * [Practice 1] Create blood oxygen level health tracker object
     *  - Get health tracker object
     *  - Pass it to Spo2Listener
     *-----------------------------------------------------------------------------------------
     *  - Hint: Replace TODO 1 with parts of code
     *    (1) Get HealthTracker object using healthTrackingService.getHealthTracker()
     *        Use HealthTrackerType.SPO2_ON_DEMAND type as parameter
     *    (2) Pass it to Spo2Listener using setHealthTracker() function
     ******************************************************************************************/
    public void initSpo2(Spo2Listener spo2Listener) {
        //"TODO 1 (1)"
        //"TODO 1 (2)"
        setHandlerForBaseListener(spo2Listener);
    }

Next, in the ConnectionManager.java file, in the initHeartRate() function, create a heart rate HealthTracker instance and pass it to the HeartRateListener instance:
- Get the HealthTracker object using the HealthTrackingService API. Use the HealthTrackerType.HEART_RATE_CONTINUOUS type as parameter.
- Pass the HealthTracker object to the HeartRateListener instance object.
  - HeartRateListener: public void setHealthTracker(HealthTracker tracker): set the HealthTracker instance for the given tracker

    /*******************************************************************************************
     * [Practice 2] Create heart rate health tracker object
     *  - Get health tracker object
     *  - Pass it to HeartRateListener
     *-----------------------------------------------------------------------------------------
     *  - Hint: Replace TODO 2 with parts of code
     *    (1) Get HealthTracker object using healthTrackingService.getHealthTracker()
     *        Use HealthTrackerType.HEART_RATE_CONTINUOUS type as parameter
     *    (2) Pass it to HeartRateListener using setHealthTracker() function
     ******************************************************************************************/
    public void initHeartRate(HeartRateListener heartRateListener) {
        //"TODO 2 (1)"
        //"TODO 2 (2)"
        setHandlerForBaseListener(heartRateListener);
    }

Start and stop trackers

For the client app to start obtaining data through the SDK, set a listener method on the HealthTracker. This method is called every time there is new data. After the measurement completes, the listener has to be disconnected.

To start measurement, in the BaseListener.java file, navigate to the startTracker() function and set trackerEventListener as the listener of the HealthTracker object. Use the HealthTracker.TrackerEventListener object instance as parameter.

HealthTracker: enables an application to set an event listener and get tracking data for a specific HealthTrackerType.
- public void setEventListener(HealthTracker.TrackerEventListener listener): set an event listener on this HealthTracker instance

    /*******************************************************************************************
     * [Practice 3] Start health tracker by setting event listener
     *  - Set event listener on health tracker
     *-----------------------------------------------------------------------------------------
     *  - Hint: Replace TODO 3 with parts of code
     *    Set event listener on HealthTracker object using healthTracker.setEventListener()
     *    Use trackerEventListener object as parameter
     ******************************************************************************************/
    public void startTracker() {
        Log.i(APP_TAG, "startTracker called");
        Log.d(APP_TAG, "healthTracker: " + healthTracker.toString());
        Log.d(APP_TAG, "trackerEventListener: " + trackerEventListener.toString());
        if (!isHandlerRunning) {
            handler.post(() -> {
                //"TODO 3"
                setHandlerRunning(true);
            });
        }
    }

To stop measurement, unset the trackerEventListener from the HealthTracker object in the stopTracker() function:
- public void unsetEventListener(): stop the registered event listener on this HealthTracker instance

    /*******************************************************************************************
     * [Practice 4] Stop health tracker by removing event listener
     *  - Unset event listener on health tracker
     *-----------------------------------------------------------------------------------------
     *  - Hint: Replace TODO 4 with parts of code
     *    Unset event listener on HealthTracker object using healthTracker.unsetEventListener()
     ******************************************************************************************/
    public void stopTracker() {
        Log.i(APP_TAG, "stopTracker called");
        Log.d(APP_TAG, "healthTracker: " + healthTracker.toString());
        Log.d(APP_TAG, "trackerEventListener: " + trackerEventListener.toString());
        if (isHandlerRunning) {
            //"TODO 4"
            setHandlerRunning(false);
            handler.removeCallbacksAndMessages(null);
        }
    }

Process obtained and batched data

The response from the platform is asynchronous with the results you want to obtain. Follow the steps below to get blood oxygen level and heart rate data.

In the Spo2Listener.java file, navigate to the updateSpo2() function and read the SpO2 data from the DataPoint:
- Get the oxygen saturation status using the DataPoint API key ValueKey.SpO2Set.STATUS.
- Get the oxygen saturation value using the DataPoint API key ValueKey.SpO2Set.SPO2.

DataPoint: provides a map of ValueKey and value with a timestamp.
- public <T> T getValue(ValueKey<T> type): get the data value for the given key

    /*******************************************************************************************
     * [Practice 5] Read values from DataPoint object
     *  - Get blood oxygen level status
     *  - Get blood oxygen level value
     *-----------------------------------------------------------------------------------------
     *  - Hint: Replace TODO 5 with parts of code
     *    (1) Remove Spo2Status.CALCULATING and set status from the DataPoint object
     *        using dataPoint.getValue(ValueKey.SpO2Set.STATUS)
     *    (2) Set spo2Value from the DataPoint object using dataPoint.getValue(ValueKey.SpO2Set.SPO2)
     *        if status is Spo2Status.MEASUREMENT_COMPLETED
     ******************************************************************************************/
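The set/unset listener lifecycle from Practices 3 and 4 can be sketched with stand-in types. The classes below are hypothetical stubs, not the real Samsung Health Sensor SDK; they only illustrate the pattern of attaching a listener to start data delivery and detaching it to stop:

```java
import java.util.ArrayList;
import java.util.List;

public class TrackerLifecycleSketch {
    // Stand-in for HealthTracker.TrackerEventListener (hypothetical).
    interface TrackerEventListener { void onDataReceived(List<String> dataPoints); }

    // Stand-in for HealthTracker: holds at most one event listener.
    static class FakeHealthTracker {
        private TrackerEventListener listener;
        void setEventListener(TrackerEventListener l) { listener = l; }
        void unsetEventListener() { listener = null; }
        // Simulates the sensor delivering one data point to the listener.
        void emit(String dataPoint) {
            if (listener != null) {
                List<String> batch = new ArrayList<>();
                batch.add(dataPoint);
                listener.onDataReceived(batch);
            }
        }
    }

    final FakeHealthTracker tracker = new FakeHealthTracker();
    final List<String> received = new ArrayList<>();
    private boolean handlerRunning = false;

    // Mirrors startTracker(): attach the listener and mark the handler running.
    public void startTracker() {
        if (!handlerRunning) {
            tracker.setEventListener(received::addAll);
            handlerRunning = true;
        }
    }

    // Mirrors stopTracker(): detach the listener so no further data arrives.
    public void stopTracker() {
        if (handlerRunning) {
            tracker.unsetEventListener();
            handlerRunning = false;
        }
    }
}
```

Data emitted after stopTracker() is dropped, which is exactly the behavior the code lab relies on when the measurement completes.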
    public void updateSpo2(DataPoint dataPoint) {
        int status = Spo2Status.CALCULATING; //"TODO 5 (1)"
        int spo2Value = 0; //"TODO 5 (2)"
        TrackerDataNotifier.getInstance().notifySpo2TrackerObservers(status, spo2Value);
        Log.d(APP_TAG, dataPoint.toString());
    }

In the HeartRateListener.java file, navigate to the readValuesFromDataPoint() function and read the heart rate data from the DataPoint:
- Get the heart rate status using the DataPoint API key ValueKey.HeartRateSet.HEART_RATE_STATUS.
- Get the heart rate value using the DataPoint API key ValueKey.HeartRateSet.HEART_RATE.
- Get the heart rate IBI value using the DataPoint API key ValueKey.HeartRateSet.IBI_LIST.
- Get the IBI quality using the DataPoint API key ValueKey.HeartRateSet.IBI_STATUS_LIST.

    /*******************************************************************************************
     * [Practice 6] Read values from DataPoint object
     *  - Get heart rate status
     *  - Get heart rate value
     *  - Get heart rate IBI value
     *  - Check retrieved heart rate's IBI and IBI quality values
     *-----------------------------------------------------------------------------------------
     *  - Hint: Replace TODO 6 with parts of code
     *    (1) Set hrData.status from the DataPoint object using dataPoint.getValue(ValueKey.HeartRateSet.HEART_RATE_STATUS)
     *    (2) Set hrData.hr from the DataPoint object using dataPoint.getValue(ValueKey.HeartRateSet.HEART_RATE)
     *    (3) Set local variable List<Integer> hrIbiList using dataPoint.getValue(ValueKey.HeartRateSet.IBI_LIST)
     *    (4) Set local variable final List<Integer> hrIbiStatus using dataPoint.getValue(ValueKey.HeartRateSet.IBI_STATUS_LIST)
     *    (5) Set hrData.ibi with the last of hrIbiList values
     *    (6) Set hrData.qIbi with the last of hrIbiStatus values
     ******************************************************************************************/
    public void readValuesFromDataPoint(DataPoint dataPoint) {
        HeartRateData hrData = new HeartRateData();
        //"TODO 6 (1)"
        //"TODO 6 (2)"
        //"TODO 6 (3)"
        //"TODO 6 (4)"
        //"TODO 6 (5)"
        //"TODO 6 (6)"
        TrackerDataNotifier.getInstance().notifyHeartRateTrackerObservers(hrData);
        Log.d(APP_TAG, dataPoint.toString());
    }

Run unit tests

For your convenience, the project includes an additional unit tests package. This lets you verify your code changes even without using a physical watch. To run the unit tests, right-click on com.samsung.health.multisensortracking (test) and execute the Run 'Tests in 'com.samsung.health.multisensortracking'' command. If you completed all the tasks correctly, all the unit tests pass successfully.

Run the app

After building the APK, you can run the application on a connected device to see the blood oxygen level and heart rate values. Right after the app starts, it requests user permission; allow the app to receive data from the body sensors. Afterwards, it shows the application's main screen and automatically displays the heart rate. To get the blood oxygen level (SpO2) value, tap the Measure button. To stop the measurement, tap the Stop button. Tap the Details label to see more heart rate data.

You're done!

Congratulations! You have successfully achieved the goal of this Code Lab. Now, you can create a health app that measures blood oxygen level and heart rate by yourself! If you're having trouble, you may download this file: Measuring Blood Oxygen Level and Heart Rate complete code (158.8 KB). To learn more about Samsung Health, visit developer.samsung.com/health
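The status-gated read in Practice 5 can be sketched with a stand-in DataPoint. The types and the status code below are hypothetical stubs, not the SDK's real DataPoint/ValueKey classes; the point is the pattern of reading the status first and only trusting the SpO2 value when the measurement has completed:

```java
import java.util.HashMap;
import java.util.Map;

public class DataPointSketch {
    // Assumed status code for a completed measurement (not the SDK's real constant).
    static final int MEASUREMENT_COMPLETED = 2;
    // Stand-ins for ValueKey.SpO2Set.STATUS and ValueKey.SpO2Set.SPO2.
    static final String STATUS = "spo2.status";
    static final String SPO2 = "spo2.value";

    // Minimal stand-in for DataPoint: a key/value map with a typed getter.
    static class FakeDataPoint {
        private final Map<String, Integer> values = new HashMap<>();
        FakeDataPoint put(String key, int v) { values.put(key, v); return this; }
        int getValue(String key) { return values.get(key); }
    }

    // Mirrors updateSpo2(): read the status first, then read the SpO2 value
    // only when the measurement has completed; otherwise report 0.
    static int[] readSpo2(FakeDataPoint dp) {
        int status = dp.getValue(STATUS);
        int spo2Value = (status == MEASUREMENT_COMPLETED) ? dp.getValue(SPO2) : 0;
        return new int[] { status, spo2Value };
    }
}
```

The same shape applies to Practice 6, where the heart rate status gates how much of the heart rate, IBI, and IBI-quality data you read.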
Code Lab: Apply gyro effects to a watch face using Watch Face Studio

Objective

Learn how to change the appearance of your watch face as the watch tilts in different directions by applying gyro effects in Watch Face Studio.

Overview

Watch Face Studio is a graphical authoring tool that enables you to design watch faces for the Wear OS smartwatch ecosystem, including Galaxy Watch4 and newer versions. It provides easy and simple features, such as gyro effects, for creating attractive and dynamic watch faces without coding.

Gyro effects are a Watch Face Studio feature that uses the watch's gyro sensor to animate or alter the look of watch faces. The effects are triggered when the watch is tilted within the -90° to 90° tilt angle range along the X and Y axes (see illustration below). A tilt angle of 0° on both axes indicates that the watch rests on a flat surface. The following shows the expected tilt angle values when the watch tilts 90° in different directions:

Direction of tilt | Tilt angle
Clockwise along the Y-axis (tilt forward) | Y = -90°
Counterclockwise along the Y-axis (tilt backward) | Y = 90°
Clockwise along the X-axis (tilt to the right) | X = 90°
Counterclockwise along the X-axis (tilt to the left) | X = -90°

Set up your environment

You will need the following:
- Watch Face Studio (latest version)
- Galaxy Watch4 or newer smartwatch running on Wear OS (API 30 or higher)

Sample project

Here is a sample project for you to start coding in this Code Lab. Download it and start your learning experience!
Gyro Effects sample project (322.00 KB)

Start your project

To load the sample project in Watch Face Studio:
1. Click Open Project.
2. Locate the downloaded file and click Open.

The sample project is a premade watch face design. It contains a background image, a digital clock, and a date.

Note: In Watch Face Studio, you can add a digital clock component in different formats, such as HH, mm, ss, or HH:mm:ss. Setting the digital time properties, such as color and dimension, per digit is also possible when you choose a component from the variable options.

The project also contains three ellipses: two ellipses with tag expressions in their Rotate > Angle properties to represent the analog time (hour and second), and one ellipse as the background of the digital hour. In the Style tab, you can see the watch face's different theme color palettes.

Resize a component when tilted forward or backward

As the watch tilts, a component with gyro effects applied can enlarge or shrink relative to its pivot. The pivot is the point at the center of the component; the component remains at the same position as it gets bigger or smaller.

1. Select the time background component. In Properties, you can see that the component's dimension (width x height) is 100 px by 100 px.
2. Select Apply gyro to show the gyro effect properties. Initially, a component does not resize, move, rotate, or change opacity when you tilt the watch in any direction. The gyro effect occurs when you set the "changing to" properties higher or lower than the default values.
3. To resize the component when the watch tilts forward or backward, go to the Y-axis properties and set the scale (width and height) to the following:
   - 70% for the -90° to 0° range
   - 130% for the 0° to 90° range

In the Run pane, you can simulate the rotation of the watch. The component enlarges from 100% to 130% as the watch tilts backward, and shrinks to 70% as the watch tilts forward.

Note: When you drag the gyro joystick upward, it simulates the backward rotation of the watch. Similarly, when you move the joystick downward, it simulates the forward rotation.

Rotate a component when tilted to the left or right

In gyro mode, a component can rotate around its pivot. When a component rotates, its position remains the same.

1. Select the digital date component. The digital date is formatted as curved text, and its pivot is at the center of the canvas (inner pivot X = 225 px, inner pivot Y = 225 px).
2. Enable gyro effects. Then, change the rotation angle to -90° for the following tilt angle ranges along the X-axis:
   - -90° to 0°
   - 0° to 90°

As the watch tilts to the left or right, the date component gradually rotates counterclockwise, up to 90°.

Change the appearance of the background image based on the adjusted tilt angle range

Range values define how tilted the watch must be for a component to resize, rotate, appear, or move to the maximum value you set. For example, when you set the range from 0° to 45° along the X-axis and the scale to 200%, the component enlarges to twice its original size once the watch reaches the 45° tilt angle to the right.

1. Select the background image.
2. Change the opacity of the component to 0 for the following ranges:
   - -45° to 0° along the X-axis
   - 0° to 45° along the X-axis
   - -45° to 0° along the Y-axis

   As the watch tilts in any direction other than backward, the background image slowly disappears as it approaches the 45° tilt angle.
3. Then, for the 0° to 45° range along the Y-axis, change the rotation angle to 360° to rotate the background image clockwise as the watch tilts backward.

Test the watch face

In the Run pane, click the preview watch face icon. Move the gyro simulator joystick to see how the appearance of the time background, digital date, and background image changes.

To test your watch face on a smartwatch, you need to connect a watch device to the same network as your computer:
1. In Watch Face Studio, select Run on Device.
2. Select the connected watch device you want to test with. If your device is not detected automatically, click Scan Devices to detect it, or enter its IP address manually by clicking the + button.

Note: The always-on display is already set in the project. To run the watch face successfully, you may need to configure its always-on display to avoid any error when you click Run on Device.

You're done!

Congratulations! You have successfully achieved the goal of this Code Lab. Now, you can apply gyro effects using Watch Face Studio to change the appearance of your watch face all by yourself! If you're having trouble, you may download this file: Gyro Effects complete project (322.02 KB). To learn more about Watch Face Studio, visit developer.samsung.com/watch-face-studio
Code Lab: Create an Android Automotive OS (AAOS) app with payments via Samsung Checkout

Objective

Create a shopping app for Android Automotive OS (AAOS) that uses templates from AAOS and the Ignite Store, and processes payments via the Ignite Payment SDK, powered by Samsung Checkout.

Partnership request

To use the Ignite Payment SDK and have access to development tools and resources, such as Ignite AAOS emulators, you must become an official partner. Once done, you can fully utilize this Code Lab. You can learn more about the partnership process by visiting the Ignite Store Developer Portal.

Overview

Android Automotive OS

Android Automotive OS (AAOS) is a base Android platform that runs directly on the car and is deeply integrated with the underlying vehicle hardware. Unlike the Android Auto platform, users can download AAOS-compatible apps directly into their cars, without needing a phone, and use an interface specifically designed for the car screen. AAOS can run both system and third-party Android applications. As AAOS is not a fork and shares the same codebase as Android for mobile devices, developers can easily adapt existing smartphone apps to function on AAOS.

The diagram below illustrates the architecture of AAOS. At the hardware abstraction layer (HAL) level, AAOS incorporates additional components such as the Vehicle HAL (VHAL), Exterior View System (EVS), and Broadcast Radio (BR) to handle vehicle properties and connectivity. At the framework level, Car Service and WebView modules are included. At the application level, the main system applications include Car System UI, Car Launcher, and Car Input Method Editor (IME). Additionally, Car Media and Automotive Host are incorporated as system apps. Third-party apps are classified into three categories: audio (media), car templated, and parked (car templates). Car templated apps use templates specified by the Car App Library, which are rendered by the Automotive Host and customized by original equipment manufacturers (OEMs).
The library consists of approximately 10 templates (list, grid, message, pane, navigation) and is used in both Android Auto (AA) and Android Automotive OS (AAOS) apps. To target AAOS, you must incorporate an additional app-automotive library that injects the CarAppActivity into the app. The CarAppActivity needs to be included in the manifest and can be set as distractionOptimized. Upon launching the application, the CarAppActivity provides a surface that the Automotive Host uses to render the template models. Additionally, on the Harman Ignite Store, you can optionally integrate the ignite-car-app-lib, which adds supplementary templates such as Explore, ListDetails, RouteOverview, and POIStreaming.

Harman Ignite Store

The Harman Ignite Store is a white-labeled and modular AAOS-based automotive app store. By connecting app developers with car manufacturers, Harman creates unique in-vehicle experiences. The Ignite Store has a rich app ecosystem with unique content, growth opportunities, and long-term partnerships. It facilitates future-proof monetization with a Payments API powered by Samsung Checkout. After registering at the Ignite Store Developer Portal, developers can submit their apps for certification and testing by Harman. Upon approval from the OEM, the developer can proceed with publishing their app. Comprehensive developer documentation and tools are available to support app developers throughout the development process.

Payments API

The Ignite Store comes enabled with payment features, empowering developers to monetize their apps. Developers can offer their apps as paid apps. The Payment SDK exposes APIs for goods and services, in-app purchases, and subscriptions. Developers can also integrate their own payment service providers (PSPs) to sell goods or services and receive the money directly in their bank account. For a frictionless in-car payment experience, Ignite provides a dedicated digital wallet app for end users to securely store their credit card information. The payment processor is powered by the industry-proven Samsung Checkout. The Developer Portal provides additional documentation describing how to access Ignite AAOS emulators; VIM3, tablet, or Cuttlefish Ignite images; and additional guidelines.

Set up your environment

Note: For SDC24 attendees, skip this step, as it's already done for you. Proceed to step 5, "Check the Ignite Payment SDK dependency."

You will need the following:
- Ignite AAOS system image running on an Android emulator or on reference devices
- Android Studio (latest version recommended)
- Java SE Development Kit (JDK) 11 or later

Sample code

Here is a sample code for you to start coding in this Code Lab. Download it and start your learning experience!

AAOS Ignite Shopping App sample code (11.7 MB)

Prepare your Ignite AAOS emulator

1. Add the Ignite AAOS emulator to your Android Studio by following the guide provided in the Ignite Store Developer Console.
2. Open the Device Manager and start the emulator.
3. Configure the emulator to use fake data for payments, as instructed in the Ignite Store Developer Console under the Offline Mode tab in the Payment SDK section.
4. The sample app requires navigation from point A (starting location) to point B (destination location). The destination addresses are predefined and near the San Jose McEnery Convention Center. To shorten the distance between the two locations, follow the steps below to set the starting location:
   a. Open the extended controls in the emulator panel.
   b. Go to Location, search for a location near the destination location, and click Set Location.
5. Next, in the emulator, go to the Ignite Navigation app's settings and enable the following:
   - Enable Navigation Simulation
   - Enable Mock Location Provider
6. Go to Settings > System > Developer options > Location and set Ignite Navigation as the mock location app.

Start your project

After downloading the sample code containing the project files, open Android Studio and click Open to open an existing project. Locate the downloaded Android project (IgniteAutomotivePaymentsSDC202488) from the directory and click OK.

Check the Ignite Payment SDK dependency

Verify that the Ignite Payment SDK library is included in the dependencies section of the module's build.gradle file:

    dependencies {
        implementation files('libs/ignite-payment-sdk-3.13.0.24030417-0-SNAPSHOT.aar')
    }

Add the payment permission

Next, go to the manifests folder and, in the AndroidManifest.xml file, include the PAYMENT_REQUEST permission to perform in-app purchases:

    <uses-permission android:name="com.harman.ignite.permission.PAYMENT_REQUEST" />

This ensures that the app has the necessary permissions to perform transactions and handle sensitive financial data.

Show the payment screen

When an item is added to the cart, the shopping cart screen displays the Select Store button, the selected pickup store address, the total amount to pay, and the details of each item added. The screen also includes the Pay button.

Go to kotlin+java > com.harman.ignite.pickupstore > screens > ShoppingCartScreen.kt. In the doCheckout() function, use the Car App's ScreenManager to navigate from the shopping cart screen to the payment screen after the Pay button is clicked:

    getScreenManager().push(PaymentScreen(carContext, session))

Note: The ScreenManager class provides a screen stack you can use to push screens that can be popped automatically when the user selects the back button on the car screen or uses the hardware back button available in some cars.

Instantiate the Ignite Payment Client

The Ignite Payment API provides a singleton class called IgnitePaymentClientSingleton, which enables performing and tracking transactions. Navigate to the PaymentScreen.kt file and instantiate the Ignite Payment Client:

    private val mIpc = IgnitePaymentClientSingleton.getInstance(carContext)

Define the Ignite payment transaction callback

The Ignite payment transaction provides three callback methods: onSuccess, onCanceled, and onFailure. In each callback, make sure to set the isPaymentFailed variable to track whether the payment succeeded. Update the session that owns the shopping cart screen to reflect the status of the payment transaction, and call the updateTemplate() function to invalidate the current template and create a new one with updated information.

    private val mIpctcb = object : IIgnitePaymentClientTransactionCallback {
        override fun onSuccess(
            requestUUID: String?, sessionId: String?, successMessage: String?,
            paymentAdditionalProperties: HashMap<String, String>?
        ) {
            Log.d(TAG, LOG_PREFIX + "onSuccess rid $requestUUID, sid $sessionId, sm $successMessage")
            CarToast.makeText(carContext, "Payment successful", CarToast.LENGTH_SHORT).show()
            isPaymentFailed = false
            session.paymentDone(requestUUID, sessionId, successMessage)
            updateTemplate()
        }

        override fun onCanceled(requestUUID: String?, sessionId: String?) {
            Log.d(TAG, LOG_PREFIX + "onCanceled rid $requestUUID, sid $sessionId")
            CarToast.makeText(carContext, "Payment canceled", CarToast.LENGTH_LONG).show()
            isPaymentFailed = true
            session.paymentError(requestUUID, sessionId, null)
            updateTemplate()
        }

        override fun onFailure(
            requestUUID: String?, sessionId: String?,
            walletErrorCode: Int, errorMessage: String
        ) {
            Log.d(TAG, LOG_PREFIX + "onFailure rid $requestUUID, sid $sessionId, wec $walletErrorCode, em $errorMessage")
            CarToast.makeText(carContext, "Payment failed", CarToast.LENGTH_LONG).show()
            isPaymentFailed = true
            session.paymentError(requestUUID, sessionId, errorMessage)
            updateTemplate()
        }
    }

Define the Ignite Payment Client connection callback

The Ignite Payment Client needs to be connected in order to perform a payment request. Once the client connects successfully, retrieve the names of the shopping cart items and use them to create an order summary. Afterwards, construct an Ignite payment request containing the total amount, currency code, merchant ID, and details of the order summary. Then, initiate the payment process by invoking the readyToPay() function of the Ignite Payment Client API.

    private val mIpccb = IIgnitePaymentClientConnectionCallback { connected ->
        Log.d(TAG, LOG_PREFIX + "onPaymentClientConnected $connected")
        if (connected) {
            val textSummary = session.shoppingCart.cartItems.joinToString(", ") { item -> item.name }
            val ipr = IgnitePaymentRequest.Builder()
                .setAmount(session.shoppingCart.getTotal() * 100)
                .setCurrencyCode(CurrencyCode.USD)
                .setPaymentOperation(PaymentOperation.PURCHASE)
                .setMerchantId(Constants.MERCHANT_ID)
                .setOrderSummary(
                    OrderSummary.Builder()
                        .setOrderSummaryBitmapImage(
                            BitmapFactory.decodeResource(carContext.resources, session.shoppingCart.store.largeIcon)
                        )
                        .setOrderSummaryLabel1("${carContext.getString(R.string.pickupstore_app_title)} ${session.shoppingCart.store.title}")
                        .setOrderSummarySubLabel1(session.shoppingCart.store.address)
                        .setOrderSummaryLabel2(textSummary)
                        .setOrderSummarySubLabel2(carContext.getString(R.string.pickupstore_payment_sublabel2))
                        .build()
                )
                .build()
            try {
                mIpc.readyToPay(ipr, mIpctcb)
            } catch (e: Exception) {
                Log.d(TAG, LOG_PREFIX + "Payment exception $e")
                e.printStackTrace()
            } catch (e: Error) {
                Log.d(TAG, LOG_PREFIX + "Payment error $e")
                e.printStackTrace()
            }
        }
    }

Start the payment process and go back to the previous screen after the transaction

Next, in the startPayment() function, connect the Ignite Payment Client and the connection callback to start the payment process:

    mIpc.connect(mIpccb)

After the transaction is completed, the updateTemplate() function refreshes the template used in the payment screen before calling the scheduleGoBack() function. Modify the scheduleGoBack() function to navigate back to the previous template screen (the shopping cart) using the pop() method of the ScreenManager:

    screenManager.pop()

Start the navigation to the store to collect the paid pickup

After a successful payment, the shopping cart screen shows the pickup store location, the details of the order, and the Go to Store button.

Go to kotlin+java > com.harman.ignite.pickupstore > PickupStoreSession.kt. Modify the navigateToStore(geofence: Geofence) function to trigger navigation to the pickup store when the Go to Store button is clicked. You can use the intent CarContext.ACTION_NAVIGATE with geo scheme (RFC 5870) data containing latitude and longitude, e.g., geo:12.345,14.8767. To send the intent, use the carContext.startCarApp() API call:

    val geoUriString = "geo:${geofence.latitude},${geofence.longitude}"
    val uri = Uri.parse(geoUriString)
    val navIntent = Intent(CarContext.ACTION_NAVIGATE, uri)
    try {
        carContext.startCarApp(navIntent)
    } catch (e: Exception) {
        Log.e(TAG, LOG_PREFIX + "navigateToStore exception starting navigation")
        e.printStackTrace()
        CarToast.makeText(carContext, "Failure starting navigation", CarToast.LENGTH_SHORT).show()
    }

Run the app on the Ignite AAOS emulator

1. Run the pickup-store-app on the Ignite AAOS emulator.
2. When the app starts for the first time, it requests user permissions. Click Grant Permissions.
3. Choose Allow all the time for the location permission and click the return button.
4. Browse the pickup store catalog and add items to the shopping cart.
5. Open the shopping cart and click Pay. You can also change the pickup store by clicking Select Store.
6. Check the order summary and billing information. Then, click Confirm and Pay to process the payment.
7. After a successful payment, the app returns to the shopping cart screen with the updated transaction information.
8. Click Go to Store to start the navigation to the store.
9. The app displays a notification when the car is near the store. Click the notification to show a reference QR code to present to the store upon pickup.

You're done!

Congratulations! You have successfully achieved the goal of this Code Lab topic. Now, you can create an AAOS templated app that supports payments by yourself! Learn more by going to the Developer Console section of the Ignite Store Developer Portal.
Due to the large variety of Android devices, supporting multiple screen sizes and display modes is an essential part of modern Android application development. A responsive and adaptable UI provides a seamless user experience for people using the application on different devices, such as phones, foldables, and tablets. This article offers some tips and tricks for developing adaptable applications that support features specific to foldable and large-screen devices:

- Displaying widgets and applications on the cover screen
- Adapting applications to flex mode
- Adapting applications to multi-window mode

Display widgets and applications on the cover screen using Good Lock

One of the biggest strengths of Android devices is customization. With Good Lock, users can customize One UI to cater to their individual needs, such as adding widgets to the cover screen. In August 2023, Samsung released the Galaxy Z Flip5, a foldable smartphone with a cover screen display. The 3.4-inch cover screen can be used to provide the user direct access to the most important information and functionality of their applications. It also offers new opportunities for customization and personalization. The cover screen is designed to display customizable widgets, such as clocks, wallpapers, notification lists, and calendars. For basic information about creating Android widgets, see Create a simple widget. You can also create widgets for your application that can be displayed on the cover screen. For example, if you are developing a messaging application, you can enhance your application by creating a widget that displays recent or unread messages on the cover screen. To enable displaying your widget on the Galaxy Z Flip5 cover screen (also called the Flex Window):

1. Create a new XML resource file inside your application's res/xml folder.

2. Paste the following code to configure your widget's display for the Flex Window:

       <samsung-appwidget-provider display="sub_screen">
       </samsung-appwidget-provider>

3. In your project's manifest file, add the metadata for the XML file you just created inside the <receiver> element for your application widget:

       <meta-data android:name="com.samsung.android.appwidget.provider"
           android:resource="@xml/samsung_meta_info_sample_widget" />

4. In the XML file containing the metadata for your application widget, change the attributes of your widget to these recommended values:

       <appwidget-provider xmlns:android="http://schemas.android.com/apk/res/android"
           android:initialLayout="@layout/test_widget"
           android:minHeight="352dp"
           android:minWidth="339dp"
           android:resizeMode="vertical|horizontal"
           android:widgetCategory="keyguard">
       </appwidget-provider>

When the application is installed, your widget appears on the list of cover screen widgets on the device. The user can display the widget on the Flex Window by going to Settings > Cover screen > Widgets and selecting the widget from the list. To learn more about widget development for the cover screen, see the Develop a widget for Flex Window Code Lab.

Although the cover screen is primarily meant for widgets, users can also use the MultiStar application in Good Lock to launch full applications on the cover screen. To launch an application on the cover screen:

1. Install Good Lock from Galaxy Store.
2. In the Good Lock application, go to Life up > MultiStar > I ♡ Galaxy Foldable > Launcher Widget.
3. From the list, select the application you want to launch on the cover screen and tap Enable Launcher Widget.

It is vital that your applications can adapt to small screens as much as large screens. The next two sections briefly summarize how to adapt your applications to both flex mode and multi-window mode.

Adapt your application to flex mode

ConstraintLayout is recommended for implementing a responsive UI in your applications.
ConstraintLayout gives you a flat view hierarchy, avoiding troublesome nested views. You can assign a relative size and distance between ViewGroups, so the components scale based on the device that the application is running on. Let's say you want to create an application that displays a list of pictures in the middle of the screen, equally spaced. To do this, you can use guidelines, which are virtual guides that help you to align views. To add guidelines and adapt your UI to them:

1. In Android Studio, go to Helpers > Guideline (Horizontal) or Guideline (Vertical) and drag guidelines onto your layout.
2. Change the value of app:layout_constraintGuide_begin from a static value to a percentage value.
3. Align your views to the new guidelines by constraining them to the guidelines.

For more information on implementing layouts using guidelines and a demonstration, see Design a single responsive layout for foldable phones using Android guidelines.

You can also use Android's Jetpack WindowManager library, which supports various form factors, including multi-window, to provide support for flex mode in your application:

1. Add the following dependencies to the application's build.gradle file:

       implementation "androidx.window:window:1.0.0"
       implementation "androidx.lifecycle:lifecycle-runtime-ktx:2.6.1"

2. Set up a lifecycle-aware task that can obtain window layout information from the WindowLayoutInfo object. The checkAndUpdateState() function is used to update your UI. Pass the newLayoutInfo object to this function and make the appropriate changes.

       lifecycleScope.launch(Dispatchers.Main) {
           lifecycle.repeatOnLifecycle(Lifecycle.State.STARTED) {
               WindowInfoTracker.getOrCreate(this@MainActivity)
                   .windowLayoutInfo(this@MainActivity)
                   .collect { newLayoutInfo ->
                       checkAndUpdateState(newLayoutInfo)
                   }
           }
       }

The WindowLayoutInfo object contains a displayFeatures list that describes the current state of your application's window.
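To make the state handling concrete before walking through the individual cases, the way window layout information typically maps to a UI posture can be sketched in pure Kotlin. Note that the enums below are illustrative stand-ins, not the real androidx FoldingFeature types, and resolvePosture is a hypothetical helper:

```kotlin
// Illustrative stand-ins for androidx FoldingFeature.State / .Orientation.
enum class FoldState { HALF_OPENED, FLAT }
enum class FoldOrientation { HORIZONTAL, VERTICAL }
enum class Posture { CLOSED, TABLETOP_FLEX, BOOK_FLEX, FULLY_OPEN }

// state == null models "no folding feature reported" (device closed,
// or the app window does not span the fold).
fun resolvePosture(state: FoldState?, orientation: FoldOrientation?): Posture = when {
    state == null -> Posture.CLOSED
    state == FoldState.FLAT -> Posture.FULLY_OPEN
    // HALF_OPENED with the fold running horizontally across the window:
    // tabletop posture (screen halves stacked top/bottom).
    orientation == FoldOrientation.HORIZONTAL -> Posture.TABLETOP_FLEX
    // HALF_OPENED with the fold running vertically: book posture
    // (screen halves side by side).
    else -> Posture.BOOK_FLEX
}
```

In a real app, checkAndUpdateState(newLayoutInfo) would perform this kind of mapping on the FoldingFeature it finds in displayFeatures and then adjust the layout for the resolved posture.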
The displayFeatures list can also contain the FoldingFeature interface, which describes device states and orientations specific to foldable devices:

- If there is no FoldingFeature, the device is closed or folded.
- If FoldingFeature.state is HALF_OPENED and FoldingFeature.orientation is HORIZONTAL, the device is a Galaxy Flip device in flex mode (tabletop posture).
- If FoldingFeature.state is HALF_OPENED and FoldingFeature.orientation is VERTICAL, the device is a Galaxy Fold device in flex mode (book posture).
- If FoldingFeature.state is FLAT, the device is fully open.

For more information about adapting your UI for various device states and orientations, see Make your app fold aware. To configure your application for the current window size, whether it is portrait, landscape, folded, unfolded, multi-window, or any other mode, override the onConfigurationChanged() function and retrieve the current window size from the WindowMetricsCalculator interface.

    override fun onConfigurationChanged(newConfig: Configuration) {
        super.onConfigurationChanged(newConfig)
        val windowMetrics = WindowMetricsCalculator.getOrCreate()
            .computeCurrentWindowMetrics(this@MainActivity)
        val bounds = windowMetrics.getBounds()
        ...
    }

To learn more about implementing flex mode in an existing Android project, see the Implement flex mode on a video player Code Lab.

Adapt your application to multi-window mode

Multi-window mode and app continuity are key foldable device features that contribute to a more versatile and user-friendly experience. They enhance the multitasking capabilities and overall usability of applications across multiple form factors. Multi-window mode enables using two or more applications simultaneously on the same screen. Applications can also be used in pop-up view, as a movable pop-up on top of another open application.
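App continuity hinges on preserving UI state across configuration changes such as folding, unfolding, and entering split-screen. The key-value save/restore pattern behind onSaveInstanceState()/onCreate() can be sketched in pure Kotlin; note that CounterScreen is a hypothetical example class and the MutableMap merely stands in for Android's Bundle:

```kotlin
// Minimal sketch of the instance-state save/restore pattern.
// A MutableMap stands in for Android's Bundle (assumption for illustration).
class CounterScreen {
    var count = 0
        private set

    fun increment() { count++ }

    // Analogous to Activity.onSaveInstanceState(outState: Bundle).
    fun saveInstanceState(outState: MutableMap<String, Any?>) {
        outState["count"] = count
    }

    // Analogous to checking savedInstanceState in Activity.onCreate().
    // A null argument means there is no saved state: keep the defaults.
    fun restoreInstanceState(savedInstanceState: Map<String, Any?>?) {
        if (savedInstanceState != null) {
            count = savedInstanceState["count"] as? Int ?: 0
        }
    }
}
```

The same idea carries over directly: write each piece of UI state into the Bundle as a key-value pair before the activity is recreated, and read it back (null-checking first) when the new instance starts.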
To implement support for multi-window mode and app continuity:

1. Add the following attributes to the <activity> element in your AndroidManifest.xml file:

       android:configChanges="screenSize|smallestScreenSize|screenLayout"
       android:resizeableActivity="true"

   Adding these flags to the configChanges attribute accounts for every relevant device configuration change, creating a more foldable-aware UI. Setting resizeableActivity to true allows the activity to be launched in split-screen and pop-up mode.

2. To remember the application state before configuration changes, you need to use lifecycle-aware components. You can save your previous application state in a Bundle inside the onSaveInstanceState() function as key-value pairs. You can then check if a saved application state exists inside the onCreate() function. If the savedInstanceState Bundle is null, there is no saved application state. If it is not null, you can retrieve the previous application state and update the UI accordingly. For more information, see Foldable adaptation essentials: app continuity and multi-window handling.

   Alternatively, you can use ViewModels. A ViewModel allows you to store the application's UI state, and also provides access to its core business logic. To create a ViewModel, create a separate class that extends the ViewModel class and initialize MutableState objects that can store the state. Create an instance of your ViewModel in your activity's onCreate() function and use it to update your UI elements. For more information, see How to update your apps for foldable displays.

If you do not want to support multi-window mode in your application, you can simply set resizeableActivity to false in the <activity> element inside the AndroidManifest.xml file.

Conclusion

This article describes some key practices for creating a responsive and adaptable UI for your application, such as enabling cover screen widgets and implementing flex mode and multi-window mode.
Following these practices ensures a versatile and user-friendly experience on foldable Android devices, enhancing multitasking capabilities and the overall usability of applications across various form factors.

Related content:
- Foldables and large screens
- Develop a widget for Flex Window
- Implement flex mode on a video player
- Galaxy Z documentation
Jan 9, 2024
Samiul Hossain
Learn Code Lab
Customize styles of a watch face with Watch Face Studio

Objective

Learn how to easily customize the style of a watch face using the customization editor in Watch Face Studio. Know how to make your watch face respond to your step count using conditional lines.

Overview

Watch Face Studio is a graphic authoring tool that enables you to create watch faces for Wear OS. It offers a simple and intuitive way to add images and components, and to configure the watch movement, without any coding. Watch Face Studio supports watch faces for the Galaxy Watch4 or newer versions, and other watch devices that run on the same platform.

The style feature in Watch Face Studio allows you to modify the style of the selected layer. You can define image styles and theme color palettes that the user can choose from to customize their watch. The customization editor in WFS enables you to refine, organize, name, and preview the styles you have defined for the watch. Conditional lines let you show and hide components on your watch face and control their behavior. Use conditional lines to change the look of your watch face in response to certain conditions, such as the time, or events such as unread notifications.

Set up your environment

You will need the following:

- Watch Face Studio
- Galaxy Watch4 or newer smartwatch running on Wear OS 3 or higher

Sample project

Here is a sample project for this Code Lab. Download it and start your learning experience!

Style customization sample project (385.30 KB)

Start your project

Download the sample project file, and click Open Project in Watch Face Studio. Locate the downloaded sample project, and then click Open.

Add styles to the watch hands and index

Styles can be added for image-based layers such as icon, index, and hand component layers. The watch face supports up to 10 image styles. In the Style tab, add a file of the same type as the existing image in the layer:

1. Select the watch hand named min_hand.
2. Click on the Style tab.
3. Click the + button to add more minute hands. Simply select the first three images.
4. Repeat the previous steps for the hour hand and index.

Create style sets

Go to the Style tab and click on Customization Editor. Select min_hand, hour_hand, and index_line. Merge these by using right-click on your mouse.

Change the name of the style sets using the Text ID. Click on the Text ID of the previously merged style set, then click the Add New button. Set the ID name to "id_time" and the default value to "time", then click OK. Now, you can change the name of your style set: click again on the Text ID and search for the ID name that you just set. Here, the name and default value for your style set are changed. Finally, repeat the same thing for the background: merge the other components to make a single background, and using the Text ID, change the name of the style set to "id_background" and the default value to "background".

Modify the theme color palette

The theme color palette is a set of colors that you can use on a specific design component. Each palette can have up to three colors, and the watch face supports up to 30 theme colors. Here, you add a new set of colors for the month component:

1. Go to the Style tab.
2. Under Theme Color Palette, click the + button.
3. Choose a color from the palette. You can change the other two colors by clicking on the color box.

Preview your watch face using the customization editor

At this point, you have created custom styles for the watch hands and index, background, and color. You can use the Run pane and the customization editor to preview how the watch face looks when changing different styles. Open the Run pane in a separate window and click Customization Editor. In the customization editor, you can see three tabs: Background, Color, and Time. Go to each tab and preview the different custom styles that you've created.

Use conditional lines for step count

Use conditional lines to make the step count on your watch face respond to certain conditions. Here, you want to show different images depending on the step count percentage: 0% to 20%, 21% to 80%, and 81% to 100%.

1. Click on the + button and select Steps as a conditional line.
2. Click on the Steps icon to view the default conditional lines for all components.
3. Select the sc_initial component, then double-click on the layer of its conditional line. A warning prompts you that any existing conditional lines will be overwritten; simply click OK.
4. Now, you can start changing the conditional line for the sc_initial component. Drag the start of the bar to 0% and drag the end of the bar to 20%. This sets the condition that the sc_initial image visually appears on the watch face when the step percentage is from 0% to 20%.
5. For the sc_moderate component, set the conditional line from 21% to 80%, and for the sc_good component, set it from 81% to 100%. This makes the sc_moderate and sc_good images visually appear at the mentioned step count percentages.

Tip: Read Apply conditional lines on watch faces to learn more about conditional lines.

Test the watch face

In the Run pane, click the preview watch face icon and you can run the created watch face with different custom designs. To test your watch face, you need to connect a watch device to the same network as your computer:

1. In Watch Face Studio, select Run on Device.
2. Select the connected watch device you want to test with.
3. If your device is not detected automatically, click Scan Devices to detect your device, or enter its IP address manually by clicking the + button.

Note: To run the watch face successfully, you may need to configure its always-on display to avoid any error when you run it on your watch. For more details about testing your watch faces, click here.

You're done!

Congratulations! You have successfully achieved the goal of this Code Lab. Now, you can create a watch face using styles and conditional lines in Watch Face Studio by yourself! If you face any trouble, you may download this file:

Style customization complete project (425.73 KB)

To learn more about Watch Face Studio, visit developer.samsung.com/watch-face-studio.
Learn Code Lab
Apply conditional lines on watch faces

Objective

Learn how to create a watch face that responds to time and notifications using conditional lines in Watch Face Studio.

Overview

Watch Face Studio is a graphic authoring tool that enables you to create and design watch faces for watches running on Wear OS. It offers a simple and intuitive way to add images and components, and to configure the watch movement, without any coding. Watch Face Studio supports watch faces for the Galaxy Watch4 or newer versions, and other watch devices that run on the same platform.

Conditional lines in Watch Face Studio let users easily control components and their behavior on watch faces. You can make your watch faces change dynamically based on time, step count, or battery using conditional lines. Now, with the latest version of Watch Face Studio, you can also set conditional lines based on events such as low battery, unread notifications, or scenarios without any events.

Set up your environment

You will need the following:

- Watch Face Studio (latest version)
- Galaxy Watch4 or newer, or any supported Wear OS watch

Sample project

Here is a sample project for this Code Lab. Download it and start your learning experience!

Conditional lines sample project (2.43 MB)

Start your project

Download the sample project file, and click Open Project in Watch Face Studio. Locate the downloaded file, then click Open.

Apply conditional lines based on time

Using conditional lines in Watch Face Studio, your watch face can visually change its design based on the time of the day. Here, you change the background image of the watch face based on certain time intervals by setting the timeline condition of two background images, named background_day and background_night.

1. Click the Show/Hide Timeline icon to show the frame area. Notice that there's a default conditional line based on time for every component. Each conditional line is represented by a bar in the timeline, and you can adjust it using the slider at the end of the bar.
2. Collapse the background group containing the two background images.
3. In the timeline area of background_day, click on its bar and hover your mouse at the start of the bar. Drag the start of the bar to 06:00:00h and drag the end of the bar to 18:00:00h. This sets the condition that background_day visually appears on the watch face from 6:00 AM until 6:00 PM.

Tip: To quickly navigate on the timeline, hold Shift + mouse scroll. Learn more about keyboard shortcuts by referring to this guide.

4. Next, for background_night, set the first time condition from 00:00:00h to 06:00:00h by dragging its bar accordingly.
5. At the start of 18:00:00h, double-click on the timeline area to create a second bar at that specific time. Drag the end of the bar to 00:00:00h, at the rightmost part of the timeline. This makes background_night appear conditionally from 6:00 PM until 6:00 AM on your watch face.

Now, it's time to check if your watch face responds correctly based on the time of the day. Click the Show/Hide Run button to open the Run panel in Watch Face Studio. Move the time control slider from 06:00:00h to 18:00:00h, and the watch face should show background_day as its background. Similarly, check if the watch face changes its background to background_night when the time is from 18:00:00h to 06:00:00h.

Set the unread notification event

Make your watch face change dynamically based on a specific device event. In this step, add an event for unread notifications on notification_envelope, an animation component included in the sample project.

1. Click + or Add Conditional Line and select Event.

Note: To remove conditional line icons such as Battery, Steps, 12h/24h, or Event, simply right-click on the icon and select Remove.

2. Click Show/Hide Event to start configuring the event-based conditional line for notification_envelope.
3. On the notification_envelope layer, double-click on the event frame. In the warning window, click OK. In this case, all existing conditional frames for this layer are deleted. Afterward, a bar is created for the event-based conditional line.

Note: Each layer responds to only one type of condition.

4. Next, drag the bar from No Event to Unread Notification.

Note: No Event is used if there is no condition set on either the battery low or unread notification events.

5. Check if the notification_envelope animation component appears on your watch face whenever there's an unread notification. Click Play Run Preview and move the unread notification slider in the Run panel. When the unread notification count is set to a value of 1 or more, the animation component should visually appear on your watch face.

Test the watch face

To test your watch face, you need to connect a watch device to the same network as your computer:

1. In Watch Face Studio, select Run on Device.
2. Select the connected watch device you want to test with.
3. If your device is not detected automatically, click Scan Devices to detect your device, or enter its IP address manually by clicking the + button.

Note: The always-on display is already set in the project. To run the watch face successfully, you may need to configure its always-on display to avoid any error when you run on device.

You're done!

Congratulations! You have successfully achieved the goal of this Code Lab. Now, you can use conditional lines in Watch Face Studio, all by yourself! If you're having trouble, you may download this file:

Conditional lines complete project (2.43 MB)

To learn more about Watch Face Studio, visit developer.samsung.com/watch-face-studio.