How does Palm Rejection really work on this device? - Galaxy Note Pro 12.2 Q&A, Help & Troubleshooting

Hello, I am trying to get familiar with this device and its S Pen-enabled apps. How does palm rejection really work? It seems that if the pen touches the screen first and I then rest my palm on the screen, my palm leaves no marks. If I put my palm down first, then depending on the settings, my palm sometimes leaves marks.
While using the handwriting feature, I often hit the large space bar by mistake. Is there any way to avoid this while resting my writing hand on the screen?

There are subtle nuances in how well stray touches are rejected, depending on the application you're in. Some apps handle it better than others.
For example, in Action Memo, as you lay your hand down to start writing with the stylus it may leave a stray mark. Experiment with this by making the first touch of your hand the knuckle of your pinky. Drag your knuckle across a little before bringing the tip of the pen to the screen. That stray mark is still there when you finish writing with the stylus.
Now repeat the same test in S Note with finger input enabled. Again, practice the motion of keeping your knuckle on the screen and dragging, then bringing the pen to the screen. Notice anything different? As long as you haven't lifted your knuckle, S Note deletes the stray line the moment the pen gets close. Any marks you finished making before the pen got close to the screen stay there, however.
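Incidentally, that delete-on-approach behavior is something an app has to implement itself. A minimal sketch of the idea in Kotlin for Android, with illustrative names (this is not S Note's actual code):

```kotlin
import android.content.Context
import android.graphics.Path
import android.view.MotionEvent
import android.view.View

// Hypothetical drawing view: `strokes` holds finished ink, `currentFingerStroke`
// is the stroke a finger is drawing right now.
class InkView(context: Context) : View(context) {
    private val strokes = mutableListOf<Path>()
    private var currentFingerStroke: Path? = null

    // Stylus hover arrives as a generic motion event before the pen touches.
    override fun onGenericMotionEvent(event: MotionEvent): Boolean {
        val penCameInRange = event.actionMasked == MotionEvent.ACTION_HOVER_ENTER &&
            event.getToolType(0) == MotionEvent.TOOL_TYPE_STYLUS
        val stray = currentFingerStroke
        if (penCameInRange && stray != null) {
            strokes.remove(stray)      // erase the stray finger mark
            currentFingerStroke = null
            invalidate()               // redraw without it
        }
        return super.onGenericMotionEvent(event)
    }
}
```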
The point being that the answer isn't as straightforward as you might think. Here we have two applications made by the same developer (at least you would THINK it's the same developer), yet they act completely differently. When writing with these devices you have to be deliberate about when and how you bring your hand and pen to the screen. With practice this becomes second nature, though. It definitely helps when note-taking apps can ignore finger input.
With regards to your problem with the handwriting-recognition pad used for text input... I'm with you there. The location of that space bar and all the other buttons is mindbogglingly stupid. They should be located above your palm. IMO this is a classic example of porting a function developed for phones held in the hand (where you do not need to rest your hand on the device) to a tablet, without realizing that usage of the function would be different on the new hardware.

Hi... using styli on tablets is new to me, and some general guidance would be appreciated. I'm looking for palm-rejection solutions for Samsung Tab devices that (unlike touchscreentune) don't require rooting.
We have some of these Notier styli in-house, and certainly they provide a very nice writing experience, except of course that S Note doesn't have palm rejection so the stylus can't be used for note taking.
A Microsoft Surface 3 will arrive later today, and it has an active digitizer with a Wacom stylus and palm rejection, so that should work well. But we'd like to use cheaper Tab devices as well.
Our applications are general note taking (instead of legal pads) and also annotating medical images.

Just my opinion here, but the presence or absence of palm rejection is not a black-and-white thing. Rejection of stray input has more to do with the device's touch-sensor type, the application used, and the way the device is used within that application than with the device itself "having" or "not having" palm-rejection support.
Take a capacitive-sensor screen, for example, where the user holds a capacitive stylus and brings the hand down to rest on the surface to begin writing. For a brief moment some other part of the hand or wrist contacts the screen before the tip of the stylus does. Without any other means of interpreting these inputs, the software has to register them somehow. As long as those points of contact don't move significantly before the stylus tip starts moving, the active application can make sense of what is going on and reject the touch input from everything but the stylus tip. This is how "palm rejection" works: all touch input has to be evaluated, and then the application decides what is intended input and what isn't.
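To make that concrete, here is a minimal Kotlin sketch of that kind of heuristic on Android: the first pointer that actually travels is assumed to be the stylus tip, and everything else is ignored (the threshold and names are illustrative):

```kotlin
import android.content.Context
import android.graphics.PointF
import android.view.MotionEvent
import android.view.View
import kotlin.math.hypot

// Illustrative heuristic for a capacitive-only screen: track every contact,
// treat the first pointer that travels as the pen tip, and ignore the
// near-stationary contacts (palm, wrist, knuckle).
class PalmRejectingView(context: Context) : View(context) {
    private val downPoints = mutableMapOf<Int, PointF>()  // pointer id -> down position
    private val slopPx = 24f                              // travel needed to count as writing
    private var inkPointerId = -1                         // pointer accepted as the pen tip

    override fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN, MotionEvent.ACTION_POINTER_DOWN -> {
                val i = event.actionIndex
                downPoints[event.getPointerId(i)] = PointF(event.getX(i), event.getY(i))
            }
            MotionEvent.ACTION_MOVE -> {
                for (i in 0 until event.pointerCount) {
                    val id = event.getPointerId(i)
                    val down = downPoints[id] ?: continue
                    val travelled = hypot(event.getX(i) - down.x, event.getY(i) - down.y)
                    // First pointer to move past the slop is assumed to be the stylus.
                    if (inkPointerId == -1 && travelled > slopPx) inkPointerId = id
                    if (id == inkPointerId) drawTo(event.getX(i), event.getY(i))
                }
            }
            MotionEvent.ACTION_UP, MotionEvent.ACTION_CANCEL -> {
                downPoints.clear()
                inkPointerId = -1
            }
        }
        return true
    }

    private fun drawTo(x: Float, y: Float) { /* append to the current stroke */ }
}
```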
IMO devices with active stylus support are always going to have an advantage when it comes to "palm rejection", in that applications can be written to completely ignore capacitive touch while the pen is in range of the screen. The LectureNotes app, for example, can be set to completely ignore finger touches for writing operations, and I'm sure it's not the only application that can do this. That isn't to say this is a global feature that ALL apps inherently have; rather, it is available to developers depending on how they implement things. Devices limited to capacitive styli will always be at a disadvantage, because the device cannot perceive a difference between the tip of a stylus and a finger.
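On Android, for instance, the framework reports a tool type for every pointer, so the "ignore fingers entirely" behavior can be a one-line filter; a minimal sketch:

```kotlin
import android.content.Context
import android.view.MotionEvent
import android.view.View

// Illustrative drawing view that accepts ink only from an active stylus and
// swallows finger/palm contacts entirely while writing.
class StylusOnlyView(context: Context) : View(context) {
    override fun onTouchEvent(event: MotionEvent): Boolean {
        if (event.getToolType(event.actionIndex) != MotionEvent.TOOL_TYPE_STYLUS) {
            return true   // consume and ignore non-stylus input
        }
        // ... handle stylus drawing here ...
        return true
    }
}
```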

Aloha...
Yes, that's right... the application needs to work with the touch screen driver to reject inputs that aren't useful.
With the Samsung Tab Pro 12.2, resting one's palm on the screen completely disables the ability to write with a stylus in S Note, so it's pretty much hopeless, at least with that app. S Note is nicely integrated with Evernote...
Will give LectureNotes a try. It mentions being "usable" with Samsung Tab products, so let's see if it can reject palm pressure.

Palm Rejection just means you don't smudge your drawing/writing with your palm whilst resting it on the screen.
Remember how you used to get a black palm from the ink as a kid, and your whole paper was covered in smudges? That.
It is not a 'Disable Touch Input' feature. It does not disable touch, it does not disable the buttons, and it does not restrict input to the Pen only.
If you're rooted, this is an option: https://play.google.com/store/apps/details?id=com.gmd.spencontrol


Software idea for increasing touch usability

Something along these lines may have already been created, but I haven't been able to find it. Shell software and TouchFLO are all very good for using our phones without a stylus, but we all know that at some point we are going to have to use a piece of WM software, and out comes the stylus.
What I am proposing is a virtual mouse cursor. I came across Innovisoft Virtuamouse, which is controlled by the d-pad, but why not have a cursor of about 50×50 pixels that can be moved by finger?
In my paint mockup picture, the red circle would be where you touch to drag the cursor, and the tip is the active point where the stylus would tap. It would be moved by dragging, and a stylus tap would be signified by removing and replacing the finger within 200ms, like a laptop touchpad.
This cursor would probably be turned on and off by a hardware button.
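Roughly, the cursor logic would be something like this sketch (illustrative only):

```kotlin
// Illustrative sketch of the idea: a finger drags the cursor, the "tip" sits
// offset from the finger, and lifting then retouching within 200 ms fires a
// tap at the tip rather than under the finger (like a laptop touchpad).
class VirtualCursor(
    private val tipOffsetX: Float = -40f,   // tip sits up and left of the finger
    private val tipOffsetY: Float = -40f,
) {
    var tipX = 0f
        private set
    var tipY = 0f
        private set
    private var lastLiftTimeMs = 0L

    fun onFingerMove(fingerX: Float, fingerY: Float) {
        tipX = fingerX + tipOffsetX
        tipY = fingerY + tipOffsetY
    }

    fun onFingerUp(nowMs: Long) {
        lastLiftTimeMs = nowMs
    }

    // True when this touch-down should count as a tap at (tipX, tipY),
    // i.e. the finger came back within 200 ms of lifting.
    fun onFingerDown(nowMs: Long): Boolean = nowMs - lastLiftTimeMs <= 200L
}
```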
Unfortunately I am not a developer, so this would have to be a project for someone else, but I'm sure people would be willing to contribute.
Since the Diamond having multi-touch seems to be confirmed (link below), I would bet that one of the first things we see from developers is just this idea: using the touch-sensitive area around the action button as a trackpad and clicking with a button press.
Diamond multi-touch vid: http://www.youtube.com/watch?v=f3Owgcos_KY&feature=related
Seems like a good idea, but I do have to question this a bit. You want to use a touch-controlled mouse to tap a button? It seems like a bit more work. Also, your picture is of a smartphone...
The idea is to use the mouse to press the odd fiddly button, not for constant use.
As for that being a smartphone screen, that was an oversight on my part - I just grabbed the first screen capture off Google Images.
I think it would be useful for web browsing.
Surur

Touch Cover Sensitivity

Has anyone found a way, or does anyone believe it is possible, to control how sensitive each button on the Touch Cover is? I presume the software just sees it as a giant flat surface and then maps presses in specific areas to button presses. So really you would be telling the software to accept touches from a specific area of the cover at a lower pressure.
What I want is to be able to increase the sensitivity of the spacebar area. I feel like when typing from my lap or any other non-solid surface I really have to hit the spacebar for it to register, whereas the rest of the buttons feel spot on.
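To illustrate the model I have in mind, here's a toy Kotlin sketch; every zone and threshold is invented, since the real Touch Cover driver isn't public:

```kotlin
// Toy model of the mapping: the cover reports raw (x, y, pressure) samples and
// software decides which key fired, each key with its own pressure threshold.
// Lowering the space bar's threshold would make lighter hits register.
data class Zone(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

data class Key(val name: String, val zone: Zone, val pressureThreshold: Float)

val keys = listOf(
    Key("space", Zone(60f, 150f, 540f, 195f), pressureThreshold = 0.25f), // extra sensitive
    Key("shift", Zone(0f, 110f, 55f, 145f), pressureThreshold = 0.40f),
)

// A sample registers a key only if it lands in the key's zone hard enough.
fun keyFor(x: Float, y: Float, pressure: Float): Key? =
    keys.firstOrNull { it.zone.contains(x, y) && pressure >= it.pressureThreshold }
```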
Thoughts?
tiny17 said:
Has anyone found a way, or does anyone believe it is possible, to control how sensitive each button on the Touch Cover is? ... What I want is to be able to increase the sensitivity of the spacebar area.
It's funny you mention that, because I have more or less the same issue, but with the Shift keys. I always miss the Shift key and hit either below it or I don't hit it hard enough. I want to prevent that from happening because it can get kind of annoying. Great question.
Your assumption is not actually true, by the way - different parts of the cover have different sensors, or no sensor, under them.
I've found hitting the spacebar with my left thumb way more reliable than with my right. Not sure why, and since this isn't my default style, it currently slows down my typing. Also, if I remember to hit further upward on the bar, it works better. It's sensitive all the way down to the top of the trackpad, but it's either more sensitive closer to the other keys, or something about my hand geometry makes me naturally hit it better when I hit further up.
I always found the sensitivity to be a bit low for my taste, and I still can't get used to this particular sensitivity level. Since it's software that interprets the presses, there could be a registry key for that. Microsoft should make it a setting.
Anyway, I'm also starting to experience a bigger problem - certain keys' sensitivity is degrading. It now takes considerably more force for often-used keys such as A and S to register compared to the rarely used ones. While it takes an okay amount of pressure for the "devices" button to register a click, I have to stab the A key for it to register. And I think it will get worse.
P.S.: I type quite a lot, but I wouldn't go for the Type Cover, because the Touch Cover is more aesthetically pleasing and would be good enough as a keyboard if it worked correctly, up to its potential.
Also, I think most people would appreciate autocorrect on the touch cover.

Drop-down menu with touch on Surface, problems

Aloha all. Having trouble knowing how to ask this... bear with me.
I am having problems with the menus that are triggered by the 'hover' action of a mouse. Using the Surface Pro, or even my Lumia 920, I am frequently unable to operate many menus on sites.
officefootballpools.com, tournamentpools.com, and a host of other sites have the same type of menus.
The normal action would be with a mouse: when you hover, it drops down a menu from which you click your selection. It isn't a normal drop-down but an HTML <li>-style menu.
I'm sure this has been discussed, but I was looking for some vocabulary so I can search the right threads.
You can't interact with hover menus via capacitive touch. The browser has no way to tell whether you want to click or just show the drop-down, so it assumes click. On some sites (with limited success) I have managed to press and hold on a link to show its drop-down; then, if you ignore the usual right-click popup, you can sometimes hit the correct item. This is on a Lumia 710.
Otherwise, if you have a device with an active stylus (the Surface Pro, for example), you can get rollover easily. Notice that when you hold the pen a tiny bit off the screen, a little dot appears where the pen is pointing, without any physical contact. This dot can trigger rollover events: Windows knows you are not touching the screen, so a touch becomes a click, while hovering the pen over the screen becomes mouse movement, which is enough to trigger rollover.
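For what it's worth, the same pen-in-range mechanism exists on Android: an active stylus hovering above the digitizer delivers hover events that an app can treat like mouse movement. A small sketch, with hypothetical highlight helpers:

```kotlin
import android.view.MotionEvent
import android.view.View

// Show a rollover highlight while the pen floats above the view, and clear it
// when the pen leaves hover range.
fun enableHoverHighlight(view: View) {
    view.setOnHoverListener { _, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_HOVER_ENTER,
            MotionEvent.ACTION_HOVER_MOVE -> showHoverHighlightAt(view, event.x, event.y)
            MotionEvent.ACTION_HOVER_EXIT -> clearHoverHighlight(view)
        }
        true
    }
}

fun showHoverHighlightAt(view: View, x: Float, y: Float) { /* hypothetical helper */ }
fun clearHoverHighlight(view: View) { /* hypothetical helper */ }
```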
Thank you for your reply. I understand, and assumed the same. I have been working with an X61t for a few years, so the jump to capacitive has thrown me a bit.
Okay, so what are web developers doing instead of this type of drop-down? I recently read an article that says a lot of developers are moving to the new msn.com-type format, with nav bars on the left or right and no hover...
Anyway, thanks again. The volume of sites with this issue is pretty large. Yahoo.com/fantasy chokes a donkey, etc.
Side question: does Win8 have the on-screen mouse that I used to have on my Lenovo? Or is that probably a Lenovo product?
A lot of web developers are doing nothing at all. In some cases clicking the link that causes the drop-down redirects to a page listing the other links in that drop-down, which is always handy. Usually devs create mobile versions of sites, which are normally touch-friendly. I have seen one or two sites create iPhone versions before; these worked nicely on Android, so I would assume they are fine on the Lumia and maybe the Surface.
I haven't ever seen an on-screen mouse before, but if there isn't one in Windows 8 there may be a third-party one somewhere. I am on my phone right now, otherwise I would have looked myself.
One advantage of using "Mobile" websites is that they should be designed with the limitation of touchscreens - specifically, the inability to track hover - in mind. That may help you out.
Alternatively, the Surface Pro uses an active Wacom digitizer (as well as a touchscreen) that can sense the pen at a distance. You can use the stylus pretty much perfectly as a mouse, with hover and right-click and everything.

Throw fruit, turn Kindle pages with eyes (5 dollar eye-tracking). Android SDK

**Integrating an eye tracker into the hardware**
“The Eye Tribe released its first eye-tracking product to developers in December -- a long, thin $99 module that attaches to a Windows laptop, computer or tablet. It sold out immediately and the company is now working on a second batch. But it also has a more exciting proposition in the pipeline -- a software development kit module for Android phones that it eventually wants to see integrated into a wide range of mobile devices.
“Most of the requisite hardware is already built into phones. The Eye Tribe just needs to persuade companies to integrate the technology.
All that's required is a camera sensor with infrared capabilities. "What we know is that in Q4 this year, sensors are coming out that can switch between regular camera and infrared camera."”
wired.co.uk/news/archive/2014-02/25/eye-tribe-android
**Cost**
“OEM vendors could likely add this sensor to their handsets for just five dollars”
reviews.cnet.com/eye-tribe-shows-off-working-eye-tracking-on-a-mobile-phone/
If modifying the device to add eye-tracking only adds 5 dollars to the manufacturing cost, then I'm sure that at least one smartphone, tablet, notebook, or laptop manufacturer will make the supposedly easy camera modification.
**See before touch**
I think that most of the time, a person will see a widget that they want to touch before they actually reach out and physically touch it.
(The only times I'm not looking are when I press the Android Navigation Bar buttons near the bottom edge of the screen - although on the larger Nexus 10, I usually have to look at them first.)
**Eyes + consecutively touching the same few buttons**
On certain tasks, it might be convenient and fast to have the option of touching "single tap where I'm looking" and "swipe up where I'm looking" buttons. You would only need one or two buttons that are close to you (kind of like the Navigation Bar buttons at the bottom).
Look, touch an easy-to-reach spot, look, and then touch the same button again. You don’t have to keep changing your hand and finger positions between each tap.
“Looking at icons on a desktop instantly highlights them, and you can then tap anywhere on the screen to open up the selected app.”
stuff.tv/mwc-2014/eyes-eye-tribe-we-play-fruit-ninja-using-nothing-our-eyeballs/feature
I guess that in one of their demos, they temporarily made the entire screen a “tap where I’m looking” button.
Besides the three default buttons in the Navigation Bar, you could add "single tap where I'm looking" and "swipe up where I'm looking" (to perhaps simulate a Page Down for reading) buttons, and those alone should allow you to do a lot of things.
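As a sketch of how such a button could be wired up on current Android (assuming a hypothetical eye tracker that hands you gaze coordinates; The Eye Tribe's actual SDK may look nothing like this), an accessibility service can synthesize the tap:

```kotlin
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription
import android.graphics.Path
import android.view.accessibility.AccessibilityEvent

// "Tap where I'm looking": given gaze coordinates from a hypothetical tracker,
// synthesize a 50 ms tap there. Requires Android 7.0+ and an enabled
// accessibility service.
class GazeTapService : AccessibilityService() {

    fun tapAtGaze(gazeX: Float, gazeY: Float) {
        val path = Path().apply { moveTo(gazeX, gazeY) }
        val tap = GestureDescription.Builder()
            .addStroke(GestureDescription.StrokeDescription(path, 0L, 50L))
            .build()
        dispatchGesture(tap, null, null)   // fire-and-forget tap
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent?) {}
    override fun onInterrupt() {}
}
```

A hardware key or an extra Navigation Bar button would then just call tapAtGaze() with the latest gaze sample.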
**Vertical touchscreen**
If you have a vertically propped up tablet with an external keyboard, you could remap a keyboard button to be the “tap where I’m looking” button.
**Hands-free interaction**
Even without the above options, I still think the ability to have a page automatically scroll down when your eyes reach the bottom, or to have an e-book turn the page automatically when your gaze reaches the corner of the text, would be pretty good features to have. They would be especially handy for computer interaction while cooking and eating, and for interacting with a vertically set-up touch device or a laptop that is more than an arm's length away while you do other stuff on the desktop.
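As a sketch, again assuming a hypothetical per-sample gaze callback, the auto-scroll case is tiny:

```kotlin
import android.widget.ScrollView

// When the gaze reaches the bottom tenth of the scrolling view,
// scroll half a page down.
fun onGazeSample(view: ScrollView, gazeYInView: Float) {
    if (gazeYInView > view.height * 0.9f) {
        view.smoothScrollBy(0, view.height / 2)
    }
}
```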
(**Google eye tracking patents**
Notably, Google has an eye tracking patent that involves recording advertisement impressions through eye focus with pay-per-gaze, and another patent that demonstrates a method to unlock a device by having a sensor in a head-mounted accessory (probably something like Google Glass) track the patterns of the pupil.
It indicates that eye tracking could have even more backing in the near future).

Question Lenovo P11 tablet, cannot draw curved lines when starting to draw

Hello,
I've recently bought a Lenovo P11 tablet to take notes. When drawing with a pen, however, I have this problem (and I have verified it by enabling the touch detector in the developer options): when I lay the pen/finger on the screen, there is an area (I would say 2-3 cm) where the cursor doesn't move even if I move the pen. When I exit that area, a straight line gets drawn between where I started and where I arrived, which is wrong, as I may have drawn curves between those two points.
This leads to two things:
* When I want to draw something rounded (like a "2"), the upper part gets drawn as a line unless I draw it very big.
* When I want to draw something small, only a dot gets drawn because I didn't leave the "area" and therefore didn't trigger the cursor.
Now I am pretty sure this is a software problem, as if it were a problem with the screen this wouldn't happen only at the start of the line.
Also, I'm sure this is not caused by the application I'm using (OneNote), as I've tested with the developer option that shows the cursor: anywhere in the OS, when I start drawing, the cursor is stuck where I began.
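For reference, this dead zone behaves like a hugely oversized "touch slop" (the distance Android waits before treating a contact as movement rather than a tap); here's a quick Kotlin sketch to log what the device is actually configured with:

```kotlin
import android.app.Activity
import android.util.Log
import android.view.ViewConfiguration

// Print the framework's configured touch slop. Stock Android uses about 8 dp
// (roughly 2 mm), nowhere near the 2-3 cm dead zone seen here, which would
// point at firmware/driver-level filtering rather than this setting.
fun logTouchSlop(activity: Activity) {
    val slopPx = ViewConfiguration.get(activity).scaledTouchSlop
    val xdpi = activity.resources.displayMetrics.xdpi
    val slopMm = slopPx / xdpi * 25.4f
    Log.d("TouchSlop", "slop = $slopPx px (~${"%.1f".format(slopMm)} mm)")
}
```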
Anyone can help?
Thanks.
Nexgan said:
I've recently bought a Lenovo P11 tablet to take notes. When drawing with a pen, however, there is an area (I would say 2-3 cm) where the cursor doesn't move even if I move the pen. ... Anyone can help?
Did you ever find a solution?
