Hi, I've recently got this phone. I noticed that direct taps on the curved area are not working (e.g. tapping the backspace button on the keyboard). However, when I swipe through the curved area, it seems to work.
Is this normal with this device or just an issue with my unit? This is my first curved screen phone.
Thanks.
(I'm not using a tempered glass screen protector.)
Seems to be intended behaviour, as a way of preventing accidental touches. Swiping is different since it's mostly used for gestures, e.g. swiping in from the right edge to open drawers.
Related
On my Vibrant I'm using a Ram Aquabox that has a rubber membrane that you touch the screen through. It works perfectly on the screen area but does absolutely nothing on the system buttons. This is a problem since Home and Back are very important for navigation.
So how is touch on the system buttons different from touch on the screen? I've seen videos about grounding being an issue if you don't hold the phone with your other hand, so is there some way I could ground the phone, or would plugging in the charger act as a ground?
Alternatively, is there an application that puts the system keys on the screen instead?
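If your ROM is based on Ice Cream Sandwich or later and you're rooted, one common way to get on-screen navigation keys is the qemu.hw.mainkeys build property. Whether your particular ROM honours it is an assumption on my part (the Vibrant shipped with much older Android), so back up build.prop first. A minimal sketch over adb:

```shell
# Remount /system read-write and back up build.prop first (requires root)
adb shell su -c "mount -o rw,remount /system"
adb shell su -c "cp /system/build.prop /system/build.prop.bak"

# 0 = show the on-screen navigation bar; restore the backup to revert
adb shell su -c "echo qemu.hw.mainkeys=0 >> /system/build.prop"

# Reboot for the change to take effect
adb reboot
```

If the property isn't supported on your ROM, the line is simply ignored and restoring the backup undoes the change.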
Hi,
I have recently had a problem with the touchscreen. If you swipe your finger a lot over the screen (for example while playing Fruit Ninja), you become unable to use any of the bottom buttons (like the home button).
I did a little investigative work with the app 'Touch Test' and found out what is happening. After the swiping action, the first finger touch (finger one, shown in blue by the app) locks. If you then swipe your finger across the screen again, it isn't registered as the first finger touch but as the second, and the bottom-bar navigation buttons only respond to the first finger touch.
Sorry if that was confusing; in short: the TF's screen thinks a finger is still touching it when it is not, and any finger that touches the screen after that is registered as an additional multitouch point.
As far as I can see there is only one workaround: hit the power button to put it to sleep, then wake it up.
I have noticed that there have been several other threads on this; none of the other users had any luck, and wiping the device doesn't help.
Has anyone else noticed this?
Thanks
Have you tried an anti-fingerprint screen protector? I'm wondering if skin grease is possibly the cause of this (I don't experience any such problems, although admittedly I don't play fruit ninja et al, and have a screen protector on).
grainysand said:
Have you tried an anti-fingerprint screen protector? I'm wondering if skin grease is possibly the cause of this (I don't experience any such problems, although admittedly I don't play fruit ninja et al, and have a screen protector on).
I thought about that for a while, but when you put it to sleep and wake it up it's perfectly fine, so maybe doing that clears all of the stuck inputs. I'll try cleaning the screen; I admittedly do have sweaty hands.
I've not been using the S-Pen the way I thought I would. The reason is that my palm often accidentally touches the notification bar, or my hand triggers a gesture set in GMD GestureControl.
So what I'm looking for is a way for the device to detect when the S-Pen is near the screen and then deactivate gestures, or even go full screen so I can't hit the notification bar.
Has anyone found an app that can do something like that?
I know there are apps that fire a trigger when the S-Pen is detached, but I'm often holding the pen while still using my fingers to press buttons, so that's not what I'm looking for.
S.Phrenic said:
I've not been using the S-Pen the way I thought I would. The reason is that my palm often accidentally touches the notification bar, or my hand triggers a gesture set in GMD GestureControl.
So what I'm looking for is a way for the device to detect when the S-Pen is near the screen and then deactivate gestures, or even go full screen so I can't hit the notification bar.
Has anyone found an app that can do something like that?
I know there are apps that fire a trigger when the S-Pen is detached, but I'm often holding the pen while still using my fingers to press buttons, so that's not what I'm looking for.
I thought the Note 10.1 itself disables finger touches when the S-Pen is near the screen, doesn't it?
I thought so too; however, gestures still work when I hold the pen near the screen.
I accidentally dropped my OnePlus One, and it left a crack running from the bottom of the left edge of the screen to the right edge, about 3/4 of an inch from the bottom. Luckily the LCD display still works, but the digitizer (touch) has stopped working in a rectangular area at the bottom of the screen. I now have to use a screen-rotation app combined with the on-screen navigation bar (the hardware buttons are also IN that rectangular area) to work with my OPO, along with a lot of turning the phone upside down to reach all the screen functions.
Today I learned about the wm size and wm density commands, which let me play with the phone's resolution, but whatever I do with them, the whole screen area is still used. Would anyone here know of a way to change the usable screen area so that the rectangular strip at the bottom is ignored for all purposes, i.e. it stays black all the time and everything begins above it?
I do have a replacement screen, but I don't suppose it's original, and I don't want to risk opening up my phone to fit it.
I'm using CM12 and my phone is rooted.
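One thing worth trying: besides size and density, the wm command has an overscan subcommand that insets the usable display area, which is exactly the "everything begins above this strip" behaviour you describe. It exists on Android 4.3 through 10, so CM12 should qualify, but that's an assumption about your ROM; the 200-pixel inset below is just a placeholder, so measure your dead zone and adjust:

```shell
# Inset the usable screen by 200 px at the bottom edge
# (format is left,top,right,bottom); the UI shifts up and
# the dead strip at the bottom is left unused.
adb shell wm overscan 0,0,0,200

# Revert to the full screen area
adb shell wm overscan reset
```

Note this only shrinks the rendered area; the digitizer's dead zone stays dead, which is fine since nothing will be drawn there anymore.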
Split screen from full screen issue
Hi, if anyone's using the two-finger split-screen gesture or the One Hand Operation + gestures for split screen, have you noticed that since One UI 5.0 it often opens split-screen apps in the top half when coming from full screen? For example, when I'm watching YouTube in full screen, the gesture mostly opens it on the bottom half, even though that's inconvenient and not handy. It sometimes opens it in the top half if I rotate the phone the other way: it opens on top when the phone is in full screen with the front camera on the left side, but on the bottom when the front camera is on the right side. I don't see the logic.