Q: Displaying Coords for use with adb shell input. Yes i know about dev opts. - General Questions and Answers

Hi all.
OK, so I will try to keep this as concise as possible.
I want to display coordinates live WITHOUT having to press the screen. I am trying to automate a game using adb shell input tap/swipe etc. That is all going fine, but the question is: how do I get coordinates on screen without accidentally moving the screen or having to long-press with the dev options grey bar?
If I tap, the coordinates appear and disappear too quickly to read, but if I long-tap it moves the screen a little and opens a menu in the game, which again moves the screen.
These are the possible solutions I can see, but I don't know how to do them or whether they're possible:
1. Make the dev options tab display x/y coordinates after the screen is released.
2. Use the S Pen (I'm on a Galaxy Note 3) hover function to display coordinates live (best and preferred).
3. Some sort of overlay app to stop the screen moving but still keep the background visible.
Anyone have any other ideas, or know how to accomplish one of the above? Preferably 2 or 3, or both.
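One more avenue that might work, as a sketch: skip the on-screen display entirely and read the raw touch coordinates over adb with `getevent`, so touching the screen never has to interact with the game's UI at all. The event device path below is an assumption; list your devices with `getevent -pl`.

```shell
# Find the touchscreen device (look for ABS_MT_POSITION_X in the listing):
#   adb shell getevent -pl
# Then stream touch events live (event2 is a placeholder; use your own path):
#   adb shell getevent -l /dev/input/event2
# Output lines look like:
#   EV_ABS       ABS_MT_POSITION_X    000001f4
# The values are hex, so convert them to decimal before reuse:
hex_to_dec() { printf '%d\n' "0x$1"; }
hex_to_dec 000001f4   # prints 500
```

Note that the raw values may be in the digitizer's own range rather than screen pixels; `getevent -p` prints each axis's min/max so you can scale to your resolution, and the result can then go straight back into `adb shell input tap X Y`.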

Related

How about a back gesture?

I apologize if it's already on here somewhere, but I didn't find much on it. What I think would make the G1 much nicer to use would be a back gesture rather than having to reach down to the back button all the time. You know, slide your thumb from right to left when you're in an app, and it would be the same as stretching your thumb all the way down to the back button. If someone could either tell me how to set gestures or make something quick and simple to do it, I think it would make the phone much nicer to use and I would appreciate it.
This forum is great and helped me tons on my Universal, so naturally I'm turning here for help first. I have a feeling that by the end of the year Android will be killing the iPhone! Turns out it's not the phone, it's the OS... lol.
The G1 does that to a certain extent, but what you are talking about would require a lot of coding in the OS; I don't think it would be a simple task. But if you are a coder and want to do this, I'm sure other people can offer you some assistance, JF being one of them for sure. Cheers!
svxdriver said:
You know, slide your thumb from right to left when you're in an app, and it would be the same as stretching your thumb all the way down to the back button.
Possibly the only space the operating system reserves for its own input detection is the notification bar. I imagine it might be possible to add some kind of swipe detection to that. However considering how narrow that space is, I don't see how it would be any better than just clicking the back button.
As for the system detecting swipes on the whole LCD, that's probably a non-starter. It would break the touch user interface of many apps.
Currently there are two ways to bring up the programs tab - dragging the tab up and a short press on the tab itself. It *may* be possible to change the function of the short press to 'back' but I think that may cause a lot of inadvertent 'back' presses...
boogie1077 said:
Currently there are two ways to bring up the programs tab - dragging the tab up and a short press on the tab itself. It *may* be possible to change the function of the short press to 'back' but I think that may cause a lot of inadvertent 'back' presses...
I think what svxdriver wants is for a leftward swipe to be universally recognized as a backstep, meaning not just on the Launcher screen but in all applications.
Not sure how well that would work. The o/s would need to distinguish between a swipe and just scrolling across the screen. Easier, I think, to just use the back button.
BobbyHill said:
Not sure how well that would work. The o/s would need to distinguish between a swipe and just scrolling across the screen. Easier, I think, to just use the back button.
Well, it doesn't have a hard time distinguishing between moving up on a page and dragging from the top of the screen to pull the notification bar down. In Windows Mobile I had a program that let me assign different gestures to different actions, and I thought the same kind of thing would be handy here. But I wouldn't want it to go back just because I swiped, only if I swipe from the edge of the screen to signify I want to go back.

How to scroll through lists without triggering items

>>>APOLOGIES IF THIS HAS ALREADY BEEN COVERED - but **I** can't find anything definitive on it for the HD2<<<
I used to use AEButtonPlus on my HD to scroll up and down in lists, so as to avoid opening each item. Does anyone know whether AEButtonPlus is compatible [it's not mentioned in the Wiki], or whether there's another way to scroll without opening each item?
Many thanks in advance
It just takes a little practice. Make sure you don't tap the screen (tapping = a short touch in one place). To stop a scrolling list, put your finger down and keep it down. If you slightly drag the screen instead of keeping your finger in one place, this avoids accidental selection as well.
Also, many screens still have a hidden scroll bar on the right.
Don't know if AEButtonPlus works on the HD2.
AEButtonPlus works and you can map the volume, Start and OK/back buttons.
Brilliant - thanks guys

[Q] Active apps and buttons on touch down events

I'm trying to get my father to use a spare Android phone that I have. He's a senior, so accessibility is a big concern. I've used a launcher to make the text and icon size pretty big.
But now he's having trouble with simple taps. When he tries to hit an icon or button, he accidentally does a small swipe across the icon, so it doesn't launch.
I've realized that apps and links don't activate on the initial touch-down event but on the release (as long as there isn't a swipe or hold). Is there a way to change this so that the touch-down event activates links?
Any help is appreciated.

one-handed mode activity?

I'm trying to write a Tasker task that would, among other things, trigger one-handed mode on my S8+. Simulating the triple home button press doesn't work as it's too slow, and I don't want to use the swipe as it stupidly resets the size of the reduced screen to default (I prefer it a bit bigger). So now I'm trying to figure out what the activity is that launches when you put the phone in one-handed and trigger that directly. It does not appear to be something you can adjust through settings as I took a list of settings from adb before and after triggering the mode change and nothing was different. Any thoughts on locating this?
You can resize that window as well. Swipe from the edge of the screen as you normally do, then swipe towards the bottom of the screen without lifting your finger up.
BurakSsl said:
You can resize that window as well. Swipe from the edge of the screen as you normally do, then swipe towards the bottom of the screen without lifting your finger up.
Thanks, that might be useful. But can Tasker simulate that input of swiping up and then down without lifting?
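For what it's worth, a multi-segment swipe (finger down, drag one way, then the other, then lift) can't be expressed with a single `input swipe`, but raw events can be injected with `sendevent`, which Tasker can call from a shell action. This is only a sketch under assumptions: the device path is a placeholder, and a real sequence also needs the device-specific touch-down/up events (e.g. BTN_TOUCH), which you can capture once with `getevent` and replay.

```shell
# EV_ABS = 3; ABS_MT_POSITION_X = 53; ABS_MT_POSITION_Y = 54; EV_SYN = 0.
# /dev/input/event2 is a placeholder; find yours with getevent -pl.
emit_point() {
  adb shell sendevent /dev/input/event2 3 53 "$1"  # X
  adb shell sendevent /dev/input/event2 3 54 "$2"  # Y
  adb shell sendevent /dev/input/event2 0 0 0      # sync: commit the point
}
# Interpolate n+1 points between (x1,y1) and (x2,y2) so each drag segment
# moves smoothly; feed every printed pair to emit_point in sequence.
lerp_points() {
  awk -v x1="$1" -v y1="$2" -v x2="$3" -v y2="$4" -v n="$5" '
    BEGIN { for (i = 0; i <= n; i++)
              printf "%d %d\n", x1 + (x2 - x1) * i / n, y1 + (y2 - y1) * i / n }'
}
lerp_points 0 0 100 50 2   # prints: 0 0 / 50 25 / 100 50
```

Chaining two `lerp_points` runs (edge to middle, then middle to bottom) before sending the lift events would approximate the "swipe in, then down, without lifting" gesture.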
Update: logcat shows a few interesting messages when one-handed mode is started. The first output is "SamsungPhoneWindowManager: toggleEasyOneHand() enabled=1". Then there's some stuff that seems to control the screen shrinking animation and set up the background for the non-usable screen space, and then there's a vibration call from a package called "com.sec.android.easyonehand".
I figure my best shot at doing what I'm trying to do is to either find a way to send that toggle command to the window manager directly or figure out the right activity to start in that system package. Unfortunately it looks like inspecting the package manifest is a no-go. Anyone have any thoughts on the next step?
Edit: was able to get the manifest after all. There's an intent filter action called "com.samsung.action.EASYONEHAND_SERVICE". I've tried calling that with an am start command in terminal emulator and I get "unable to resolve intent".
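The "unable to resolve intent" error fits a manifest whose filter is on a service rather than an activity, so one possible next step (a sketch; the component class name below is purely hypothetical, substitute whatever the device actually reports) is to try the service variant of `am` and list the package's registered components:

```shell
# Try starting it as a service instead of an activity:
#   adb shell am startservice -a com.samsung.action.EASYONEHAND_SERVICE
# List the package's registered components to find an explicit name:
#   adb shell dumpsys package com.sec.android.easyonehand
# dumpsys prints component lines as "<hash> <package>/<class>"; this filter
# extracts just the component name (the sample line is made up):
pkg="com.sec.android.easyonehand"
sample_line="  4fa2b1c $pkg/.EasyOneHandService"
echo "$sample_line" | awk -v p="$pkg" '$2 ~ p { print $2 }'
# prints com.sec.android.easyonehand/.EasyOneHandService
```

With an explicit component in hand, `am startservice -n <package>/<class>` avoids intent resolution entirely.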
So you could not find a proper solution? I'm interested in this too.

[GUIDE] Using Gestures to easily navigate the P2 with just one hand (requires root)

Even though the P2 can be considered large with a 5.5" display, the lower region of the screen is occupied by the on-screen navigation bar. While there's an option to turn off the navigation bar and use the fingerprint button exclusively, I still find this alternative lacking on several fronts:
1. I can't tell you how many times I return to the home screen daily; it gets tiring to always have to push down the fingerprint button.
2. You need to touch the fingerprint button for a second or so before the recent apps window is launched. If you've ever needed to flip between apps quickly, you'll know how irritating this delay is.
3. Because one button is used for three functions (back, home, recent apps), there's no way to quickly switch between apps (like Alt+Tab in Windows).
4. I sometimes find myself lying on my side, holding the phone with one hand, finding it almost impossible to reach all the way down to the navigation bar, let alone the fingerprint button.
Fortunately, I have a solution that I believe addresses all of the problems above and that is to navigate the phone through the use of gestures.
I've been using GMD GestureControl for about a week now, and I think I've found an optimal combination of gestures that not only work but are also intuitive to use. There's nothing to lose by giving it a try.
It can take a bit of time to set it up the way I have mine set up. So, if you don't feel like spending the time yourself, you can just restore my Titanium Backup backup. Download it from here. Just extract the zip file, copy the contents into your Titanium Backup folder, then open Titanium Backup and restore.
If instead, you would like to set it up yourself, here's how I currently have mine set up:
1. After installing the app, the first thing to do is go to "Device Setup" by pressing the menu button in the top right corner of the screen. Change your values to the ones you see in the first attached image below. Then go to "Settings" and change the values to those in the second attached image.
2. To create a gesture: press the plus button > Select a gesture > create path. You can see all the gestures I created in the third attached image. Recreate all of them.
3. You can decide in which region of the screen a gesture is detected by choosing the appropriate zone in the "select starting zone" option.
4. You need to go to "advanced options" (attached image no. 4) at the bottom of the screen to decide whether or not a gesture can be triggered while the keyboard is active. I have most of the gestures disabled when the keyboard is active because I sometimes swipe to type, and I don't want an accidental swipe to trigger something. For this reason, I have duplicated most of the gestures for only the top half of the screen (above the keyboard) so that they continue to work when the keyboard is active.
5. Make sure you enable "evaluate on release" on all gestures so that a gesture is only triggered after you lift your finger off the screen.
If you've gone through all the steps, it's now time to start trying things:
1. To open recent apps, just make a small ^ sign anywhere on the screen.
2. To go home, just make a small V sign anywhere on the screen.
3. To flip between apps, just make a small < or > sign anywhere on the screen, starting from the top.
4. To press back, swipe away from left or the right edge.
5. To open the launchpad, swipe up on the left or right edge.
Good looking out. I really like this option.
