I was showing the Tablet PC to a documentation guru I know, and she was quite impressed. I handed her OneNote and she caught on faster than anyone else I’ve seen. I think that’s in large part due to the excellent enhancements in the OneNote Service Pack 1 beta I’m running. For the most part she was able to convert her handwriting to text without a hiccup. Interestingly, when the recognition was off the mark, she immediately tried to use ink gestures to fix the text.
Although I showed her how to make the corrections in OneNote, I also brought up my ink editing prototype app. This is what she wanted. She suggested that I try to figure out how to integrate it into Word. Hmm. I have no idea whether it’s possible to query a Word document for the text that falls within a given pixel region. For that matter, I don’t know if I can get at the ink, or whether it’s possible to lay another ink layer on top of the window for editing. Either way, I assume that once I know the text position I can perform the editing operations 🙂
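If Word’s object model really does let you go from screen coordinates to text the way the docs for Window.RangeFromPoint suggest, the probe might look roughly like this. This is an untested Python sketch against Word’s COM automation interface via pywin32; the coordinates and the “scratch-out gesture” framing are made up for illustration.

```python
# Untested sketch: find the text under a screen point in a running Word
# instance, the way an ink-gesture handler might, then edit it.
# Assumes pywin32 is installed and Word is already open with a document.
import win32com.client

def text_under_point(x, y):
    """Return the Word Range at screen pixel (x, y), or None if there isn't one."""
    word = win32com.client.GetActiveObject("Word.Application")
    target = word.ActiveWindow.RangeFromPoint(x, y)  # Range or Shape under the point
    if target is None:
        return None
    try:
        target.Text  # Shapes don't expose Text directly; treat them as "no text here"
    except Exception:
        return None
    target.Expand(2)  # 2 == wdWord: widen to the whole word so edits hit a sensible unit
    return target

# Example: a scratch-out gesture centered at (640, 400) deletes the word there.
rng = text_under_point(640, 400)
if rng is not None:
    print("Deleting:", rng.Text)
    rng.Delete()
```

That still leaves the harder half of the question open: getting an ink layer over Word’s window in the first place, and mapping gesture bounding boxes into the right screen coordinates.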
Sometimes it’s knowing these integration issues that can make all the difference.
As someone who works with words and manuscripts all day, every day, I strongly agree: ink editing would be a stunning development. Imagine a marked-up manuscript where the recipient could then go through and simply approve or reject each inked edit!
I agree. The more I use a Tablet, the more I believe that a few conventional paper metaphors will work well in the Tablet kingdom.