The key word is 'few'.
Congratulations, you are no longer the target audience for the OS. Instead, MS has decided to build an OS experience for the people who do want a tablet to do those things, or do want to connect with their Xbox, and who don't want to worry about where their data is living.
Even from a developer standpoint, I could see touch having a serious impact. You can do a lot of things on a tablet more naturally and quickly than you can with a mouse and keyboard. On my current convertible tablet running Windows 7, some things still go much faster than when I try to do them elsewhere; moving/resizing windows and zooming are decent examples. I actually under-utilize the features because I'm so used to kb/m that I often forget I also have a touch screen. It's miles better than a touchpad, that's for sure.
I've been thinking about trying to build an interface with the Kinect that treats monitors as touch screens without you having to be within arm's length of them, and something like that I think could be really cool. I've been too busy to work on it, but you basically just need eye tracking plus a calibration step so it knows where the screens are. Then you project the screens onto a plane about arm's length away from the user, and you can tell exactly what they are pointing at and when they tap it.
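The core math for that is just a ray-plane intersection: cast a ray from the eye through the fingertip and see where it crosses the calibrated virtual screen plane. Here's a minimal sketch of that idea; all the names and numbers are my own illustration, and in a real setup the Kinect skeleton/eye tracking would supply the eye and fingertip positions while calibration would supply the plane:

```python
def pointing_hit(eye, fingertip, plane_point, plane_normal):
    """Intersect the eye->fingertip ray with a virtual screen plane.

    All arguments are 3-tuples of floats in the same coordinate frame
    (e.g. meters in Kinect camera space). Returns the 3D hit point on
    the plane, or None if the ray is parallel to or points away from it.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    # Direction of the pointing ray, from the eye through the fingertip.
    d = [f - e for f, e in zip(fingertip, eye)]
    denom = dot(plane_normal, d)
    if abs(denom) < 1e-9:
        return None  # ray runs parallel to the plane
    t = dot(plane_normal, [p - e for p, e in zip(plane_point, eye)]) / denom
    if t < 0:
        return None  # plane is behind the user
    return [e + t * di for e, di in zip(eye, d)]

# Hypothetical frame: eye at the origin, fingertip half a meter out,
# and the calibrated plane 0.7 m in front of the user, facing them.
eye = (0.0, 0.0, 0.0)
fingertip = (0.1, 0.05, 0.5)
plane_point = (0.0, 0.0, 0.7)
plane_normal = (0.0, 0.0, 1.0)
hit = pointing_hit(eye, fingertip, plane_point, plane_normal)
```

Mapping the hit point to pixel coordinates is then just a 2D transform within the plane, which is exactly what the per-user calibration step would nail down. A "tap" could be detected as the fingertip briefly pushing forward along the ray.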
Stuff like this is the next step in NUI, imo. Being able to point at something and ask, "What's that?" is an awesome possibility.