
Simultaneous Document Development and Production

One major flaw I've found with many documentation packages is the fact that you cannot work on your source documents while you are building your output. Let's take WebWorks Publisher 2003 (WWP2003) for example. It's one of the better Help Authoring Tools (HATs) out there, and due to the early teething problems exhibited by its replacement (ePublisher 9.x), I'm not yet ready to migrate away from WWP2003. For those of you that don't know, WWP2003 integrates with FrameMaker or (*gasp*) Microsoft Word. To use it, you define your output format (WebHelp, CHM, JavaHelp, etc…) and map all your source FrameMaker styles to your WWP2003 output styles.

When it's time to build your project, you simply press the build button in WWP2003, and it churns away for an interminably long time while building your project. Unfortunately, while it is churning away, you can't continue to work in FrameMaker. That means lots of down-time while you wait for projects to build. If you have a number of projects to build, you can easily lose several days to nothing but builds. What's worse, because Quadralay wants to sell you additional applications, you can't tell WWP2003 to build several of your projects in sequence. You have to let it build one project, wait, then open another project and click Build, wait, then open another project and click Build, etc… This is an intolerable problem with many (most?) HATs out there. I draw attention to WWP2003 not because it is unique in this respect, but because it is the HAT I am currently using.

The problem with this approach is twofold: First, since it essentially locks FrameMaker, you can't continue your development work while it is building. This means that even if WWP2003 is building Project 1, you can't even work on Project 2 until it finishes. Ack. Second, if you need to build multiple projects, you have to actively monitor the build process so that when it finishes building Project 1, you can manually open and start building Project 2. This sucks…but there is a way around it.

Virtualize your Build Environments

You can virtualize your build environment and free yourself from the tedium of waiting for your projects to build. If you haven't yet worked with virtualization, it sounds much more complex than it really is. Virtualization essentially allows you to run a "virtual" operating system in a window on your host operating system. This host operating system can be your development computer, or it can be hosted on another server or computer in your environment. Though the virtual operating system runs in a window on another operating system, it is for all intents and purposes another computer…only a virtual one, since it shares the hardware of an already existing computer. OK, it still sounds pretty complex if you haven't done it before…but trust me, it is pretty easy, and your learning curve is likely to be no more than a few minutes once you see how it all works.

When virtualizing your build environments, you’ll need to make sure you do a few things right off the bat:

1) Mirror your directory structures
What is important here is that you mirror your documentation directory structures on your Virtual Machine. By this I mean that if you keep all your files for your projects in D:\Projects\Layouts and D:\Projects\Artwork…you’ll want to create duplicate folders (even if empty) on your Virtual Machine.
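If you'd rather script this step than click through Explorer, a few lines of Python will recreate the folder tree for you. This is just a rough sketch: the UNC path to the Virtual Machine is a made-up example, so substitute whatever share or mapped drive you actually use to reach the VM.

```python
import os

# Hypothetical paths for illustration -- substitute your own.
SOURCE_ROOT = r"D:\Projects"             # the tree on your development machine
TARGET_ROOT = r"\\BUILD-VM\D$\Projects"  # the mirrored root on the Virtual Machine

# Recreate every folder from the source tree on the VM, even the empty ones,
# so the build environment sees exactly the same directory layout.
for dirpath, _dirnames, _filenames in os.walk(SOURCE_ROOT):
    relative = os.path.relpath(dirpath, SOURCE_ROOT)
    target_dir = os.path.normpath(os.path.join(TARGET_ROOT, relative))
    os.makedirs(target_dir, exist_ok=True)
    print("Verified:", target_dir)
```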

2) Install your HAT software
Install your HAT software on the virtual machine. If you are using WWP2003, you'll need to install WebWorks Publisher Pro 2003 and FrameMaker. Depending on the licenses of your HAT software, you may have to shell out some money for this. If you need to justify the expense to management (who doesn't?), just figure out how much time it takes you to build each project and how much time you'll be able to recapture per week, month, year, etc… by being able to work and build your output simultaneously. The ROI should be easy to calculate, and the payback period pretty short.

3) Install Microsoft SyncToy
Microsoft SyncToy is an "intelligent" file copy tool. What's more, it is free. There are only a few options to it, and even fewer that you'll need to use. I've set mine up to copy the root directory of my development folder structure (D:\Projects\) using "Echo" copy. What this does is mirror all the files and folders in the D:\Projects directory of my development machine to the D:\Projects directory of my Virtual Machine. If I delete a file on my development machine, it gets deleted on my virtual machine the next time I run SyncToy. If I create a new file on my development machine, it gets copied over to my Virtual Machine the next time I run SyncToy. More importantly, SyncToy is smart enough to copy only the files from my development machine that are no longer identical to those found on my virtual machine. Since my development folder contains about 12 GB of FrameMaker files and source (.eps) graphic files, it's a relief not to have to copy ALL of them over when I need to make some new builds.
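SyncToy itself is a point-and-click tool, but if you're curious what its "Echo" mode boils down to (or ever need the same behavior without SyncToy), here's a rough approximation in Python. The paths are the same hypothetical ones as above, and note that the second pass deletes files on the target side, so try it on something disposable first.

```python
import os
import shutil

SOURCE_ROOT = r"D:\Projects"             # development machine
TARGET_ROOT = r"\\BUILD-VM\D$\Projects"  # hypothetical share on the build VM

def needs_copy(src, dst):
    """Copy only when the target file is missing, or its size/mtime differ."""
    if not os.path.exists(dst):
        return True
    s, d = os.stat(src), os.stat(dst)
    return s.st_size != d.st_size or int(s.st_mtime) != int(d.st_mtime)

# Pass 1: copy new and changed files from source to target.
for dirpath, _dirs, files in os.walk(SOURCE_ROOT):
    rel = os.path.relpath(dirpath, SOURCE_ROOT)
    dst_dir = os.path.normpath(os.path.join(TARGET_ROOT, rel))
    os.makedirs(dst_dir, exist_ok=True)
    for name in files:
        src, dst = os.path.join(dirpath, name), os.path.join(dst_dir, name)
        if needs_copy(src, dst):
            shutil.copy2(src, dst)   # copy2 preserves timestamps

# Pass 2: delete anything on the target that no longer exists on the source.
for dirpath, dirs, files in os.walk(TARGET_ROOT, topdown=False):
    rel = os.path.relpath(dirpath, TARGET_ROOT)
    src_dir = os.path.normpath(os.path.join(SOURCE_ROOT, rel))
    for name in files:
        if not os.path.exists(os.path.join(src_dir, name)):
            os.remove(os.path.join(dirpath, name))
    for name in dirs:
        if not os.path.exists(os.path.join(src_dir, name)):
            shutil.rmtree(os.path.join(dirpath, name))
```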

4) Keep it lean
You are going to be using the VM only to build your projects. You don't need to install your favorite media player or your favorite image editing software. Only install those components that you need to actually build (not develop) your projects. Use Add/Remove Programs to remove any crapware that gets installed by default.

Once you’ve set up your Virtual Machine, you are pretty much good to go. When you want to build a bunch of projects, just launch your VM, use SyncToy to sync your development files to the VM, and start building your projects. You can continue working on your development machine because your HAT on your build machine is working from a duplicate set of files.

There are, of course, some drawbacks to this approach. The first is cost. Due to licensing restrictions, you may have to pay for extra licenses for Windows and your HAT software. For me this was a non-issue because we have access to licenses for Windows through our development group (of which I am a part) and we had some older licenses for our HAT software that were not being used. Your mileage may vary though. The second potential problem with this setup is resource consumption. If you are hosting your Virtual Machine on your development box, you'll want to make sure you have lots of RAM and some extra CPU cycles to spare. I've done this in the past, and while my development machine was somewhat slowed down, it was still perfectly usable so long as I didn't open Photoshop and decide that I wanted to edit a bunch of .eps files. If possible, consider hosting your VM on a spare computer somewhere. Your engineering group may even have some servers dedicated to hosting virtual machines. Depending on their hardware, you may even find building your projects on the VM is faster than building them on your production machine!

Some of you might be asking the obvious right about now: "If I have an extra computer lying around, why should I muck around creating a Virtual Machine?" Well, the answer to that will depend on a few things, I suppose. First off, there's no reason why you can't just use your spare hardware as a duplicate build machine and use SyncToy to keep them both in sync. In 99% of cases, using dedicated hardware will improve your build times. However, with a Virtual Machine you get increased portability. As you retire and acquire new computers, you won't have to repeatedly recreate a duplicate build machine. All you'll have to do is copy your VM over to the new computer and launch it. Going to be on the road for a while? Just copy your VM to a laptop and take your projects and HAT with you. For me, the portability outweighs the speed. Once I offload my builds to my Virtual Machine, I don't really care that it takes 25% more time to build, since I can continue working while things are building. If I have something that needs to be built ASAP, there's no reason why I can't build it directly on my production machine…and sometimes I have to do this.

One last thing I'd like to mention in regards to "Why virtualize?" This may be a huge deal to some, and not at all interesting to others…but the last reason to consider virtualizing a build machine is operating system independence. When you have a VM, you can run your Windows build machine on any OS that can run a virtual machine (providing that you stick with the same virtualization vendor; for example, I wouldn't expect to be able to run a Parallels Virtual Machine on VMware Player). At home, my OS is Ubuntu…but I can still work on and build my projects on Linux, using VMware Server to launch my virtual build machine. As of today, VMware provides virtual machine players for a host of operating systems including Mac OS X, Linux (many flavors), Windows, Sun, etc… Chances are that if you use it, there's a VM player for it.

The one downside a Virtual Machine doesn't address is the fact that you still have to manually monitor your VM if you are building multiple projects. This is somewhat alleviated by the fact that you can still work on your development machine while building projects on your virtual machine…but it is still somewhat of a pain. Fortunately, there is help in this regard.

Automate your Build Processes
You can "easily" automate building multiple projects using some pretty simple scripting tools. I kind of fell into using AutoIt to automate my builds, but there are other tools out there that accomplish the same thing. I'm working on an article right now that explores using AutoIt for build automation. I'm not done with it yet, but if you're interested, keep your eyes on the blog.
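In the meantime, here's the basic shape of the thing. I drive WWP2003 through its GUI with AutoIt, but the queuing idea works in any scripting language; below is a rough Python sketch. The build commands and project paths are placeholders (I'm not aware of a documented command line for WWP2003 itself), so treat this as an illustration of running builds back to back unattended, not a drop-in script.

```python
import subprocess
import time

# Each entry is (project name, command line that builds it).
# These commands are placeholders -- substitute whatever your HAT or build
# tool actually accepts on the command line, or a wrapper script that
# drives the GUI (which is what my AutoIt scripts do).
BUILDS = [
    ("Project 1", [r"C:\Tools\your_build_tool.exe", r"D:\Projects\Project1\project1.wep"]),
    ("Project 2", [r"C:\Tools\your_build_tool.exe", r"D:\Projects\Project2\project2.wep"]),
]

for name, command in BUILDS:
    started = time.time()
    print(f"Building {name}...")
    result = subprocess.run(command)   # blocks until this build finishes
    minutes = (time.time() - started) / 60
    status = "OK" if result.returncode == 0 else f"FAILED ({result.returncode})"
    print(f"{name}: {status} after {minutes:.1f} minutes")
```

The point is simply that each build blocks until it finishes, so you can queue up every project before you leave for the night and collect the output in the morning.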

DataHand Defunct?


Sad news, folks. It looks like DataHand Systems is teetering on the edge and is no longer selling the futuristic DataHand keyboard. The rather cryptic message on their webpage says:

DATAHAND SYSTEMS, INC. ANNOUNCES IT IS NO LONGER MARKETING AND SELLING THE DATAHAND ERGONOMIC KEYBOARD

Unfortunately our supplier has advised us they can no longer produce the DataHand Ergonomic Keyboard and until a new manufacturer can be identified and is in production, the company will no longer offer the DataHand Keyboard for sale. The 90 day warranty will be honored for units shipped since October 28, 2007.

There is no mention of whether there is any hope of finding another manufacturer.

DataHand Tombstone

Why PDFs Suck

Though it stands for Portable Document Format, PDF might as well stand for Printable Document Format. That’s because printing is just about the only thing a PDF is good for.

OK, that might sound a little harsh… but for reasons well elucidated elsewhere, PDF is a poor choice of format if:

  1. The document is longer than a few pages
  2. The document is going to be read online

Most technical documentation for software falls into the above categories. This isn’t to say that there is no place for PDFs in technical communication…it is just to say that PDFs should be used for what they do best: facilitating the printing of content.

When I try to read PDFs online, I regularly encounter the following:

  • Acrobat hangs Firefox. Yeah, isn’t it lovely having to kill the Firefox process, relaunch Firefox, find the page with the link to the PDF, and try to open it again just to find that one piece of information I need to accomplish what it is I am trying to do.
  • Acrobat hangs itself. This doesn’t happen as often as the above, but it IS still frequent enough to give me that “will it work this time?” feeling any time I open a PDF.
  • In addition, if I do get the PDF open without crashing anything, I have to search the document. Opening the PDF to the cover or title page doesn't do anything for me. Searching, of course, opens a sidebar search that inevitably obscures some of the content. This sidebar is, of course, persistent… If I go to search another PDF document, the same search term from the first PDF is still in the sidebar.
  • Some PDFs are constructed to dynamically download content from the web as I jump from page to page. This is infuriatingly slow and cumbersome.
  • Using the scrollbar to scroll vertically through a PDF document jumps between pages. Why can't it scroll the document smoothly as I drag the slider?
  • If the PDF opens embedded in the web browser, it breaks all sorts of usability features I've come to rely on. What does File->Print do in the menu? What does File->Save do? Why don't they ever do what they should?
  • It inserts another toolbar in the browser that I'm not used to working with. On top of that, it's cluttered with buttons I never use (unless it's the Save button that I inevitably use AFTER trying File->Save).
  • Toolbar buttons use non-standard metaphors. Why is the Search button a set of binoculars and not a magnifying glass like every other Search button? Every time I look for the Search button in Acrobat, I see the binoculars and it registers as some sort of Zoom feature and not a search feature.

The best solution, of course, is to provide content in both HTML (familiar web-paradigm) and PDF format. This lets users access the web version of content for reading online, while providing them an effective mechanism for killing lots of trees if they want to print out the whole thing.

If you are writing technical documentation, single sourcing to both web and PDF should be on your roadmap…as it is the right thing to do for your readers. If you have to choose between one format or another, think long and hard about it. Go PDF-only and you make all your documentation that much less accessible, but placate those who would want to print it. Go HTML-only and you’ll be doing right by your users (even if they don’t know it), but prepare to hear people complain if you don’t provide an easy mechanism to print out all the content (a feature that is sorely lacking from most web help systems).

Top 12 Non-Expert and Non-Solicited Pieces of Advice on Technical Writing

12) You should not type with your eyes closed.
— OK. Fair enough. Nobody is saying that you should type with your eyes closed. However, there is no reason why you can't type with your eyes closed if you want to (providing that you know how to touch-type). If you've ever been inspired, or tried to transfer a perfect model of a concept held in your brain through your fingers and onto the screen, you'll know that everything else in the world is nothing but a distraction. …and you'll realize that sometimes the best thing…and perhaps the only thing…you can do to avoid losing the thought is to close your eyes and write. Of course, if you've never had this experience, there is no reason why you'd ever need or want to close your eyes while typing…and even less likelihood that you'd understand what someone was doing if you saw them typing with their eyes closed.

11) Dense content is better because it is shorter.
— If dense content increases complexity and the cognitive load required to understand the topic, diminishes the accessibility of information within the content, and generally destroys any semblance of usability, then dense content is definitely NOT better. I've had more than one person (none of them writers) hold up the most compacted, impenetrable piece of gobbledygook to me as a model of how technical content should be written, and roll their eyes when the same content is revised for usability and understandability. In this context, the opposite of dense content would be content that has been revised by breaking information out into task, reference, and concept sections; content that has been reworked by chunking information, making use of parallel structure, increased whitespace, logical headings, etc… Such content often has a higher aggregate page count than the same information compacted into a block of text so tight that it is nearly impenetrable for the reader (and damn near impossible to maintain and change without springing the whole works).

10) Technical writing is easy. Just do it this way: _____________.
— We’ve all been there. Schmitty’s Law states that the less informed someone is on the theory or praxis of technical communication, the more amusing “their way” will be. The corollary to Schmitty’s Law states that the more amusing “their way” is, the more vehemently they will argue for the immediate adoption of their approach and the less willing they will be to consider other approaches.

9) A low page count is more important than comprehensive content.
— “Anything said in 40 pages would instead be better said in 10. (Regardless of the scope of material covered.) If it is not possible to decrease page count, consider changing font size, line spacing, and page margins to accommodate the requirement.” If you’ve ever been the recipient of such myopic advice, you were probably damn near apoplectic thinking of how to even respond.

8) Users don’t read documentation.
— Even in the face of direct evidence to the contrary, people will still spout this adage and attempt to use it in order to justify doing “very bad things” to the resident technical writer, or asking him/her to do “very bad things” to the reader. An example of this would be, “Since users don’t read documentation anyways, let’s remove all task oriented procedures from the online help.” Of course, the kernel of truth to this statement is that a) users quickly learn to avoid poorly written, dense, and inaccessible documentation and b) users do not read a user’s guide, online help system, etc… cover to cover. They look for topics relevant to their current context.

7) You don’t need to know when the software will be released in order to schedule your efforts.
— Documentation should "just be ready" whenever the software is ready. This is sometimes called the jack-in-the-box method of technical writing. That is, the tech writer cranks and cranks and cranks away on his/her projects and one day, SURPRISE! The software is posted to the web and we've released with nary a warning.

6) You don’t need specifications to write software documentation.
— Our software is so easy, you don’t need specifications. In fact, users probably won’t even need to read the documentation anyways…so you shouldn’t need a specification to write it. This is a classic, and is usually the mantra of the engineers and experts who design the software in the first place. Needless to say, more often than not, the software is pretty darn complex when mere mortals are asked to use it.

5) You shouldn’t use “you” in technical documentation.
— Use of "you" is not formal enough for technical documentation. This seems to be a holdover from fifth-grade composition classes devoted to some ridiculous paradigm of "formal writing". Granted, there is no need to use "you" superfluously if a reference to the reader is not needed, but in some situations you can't avoid using it without sacrificing clarity and usability upon the altar of some misguided notion of formality.

4) The imperative voice is insulting to readers.
— A task or procedure step should avoid being constructed in the imperative voice, “because it’s insulting.” I was told this by someone who then added, “I mean really, who are YOU to tell ME what to do!?!?” Uh, ok….

3) Users don’t want to be told how to do anything. Instead, they just want the bare facts, and they will figure out everything else on their own.
— Despite all the research to the contrary, this lovely gem comes up again and again. Reference material on window and system objects is all anyone needs to figure out everything they can do with the software. This bit of advice is often used in conjunction with numbers 4 and 2 in an attempt to advocate for number 11.

2) Everything can and should be reduced to a diagram with callouts.
— There’s no need for tasks/procedures. All a user needs is a screen-shot or diagram with callouts describing the main elements of the application/window/procedure/etc…. Everything else, the user can figure out on his/her own.

1) You are not allowed to cut and paste content.
— Yes, I encountered this little bit of wisdom quite recently. I was even told I was "fooling readers" by making them read something twice, and that in general, "cut and paste is evil". That "cut and paste is evil" is, in fact, an adage that I agree with…when it comes to writing code. However, look at any number of competently written online help systems, user guides, etc… and you will see that content re-use is an essential element of technical communication. The idea that one cannot or should not cut and paste in technical documentation is about as ridiculous as it is misinformed. It's even more ridiculous when many a HAT has built-in support for managing content reuse.

Long Term Review – Adesso TruForm Pro

The Adesso TruForm Pro has a lot of promise; unfortunately, it just didn't live up to it all.

Adesso TruForm Pro

I’d like to start this review with a note to hardware manufacturers: It is 2007. Stop making devices that are incompatible with USB. Enter my first gripe with the Adesso keyboard: Incompatible with USB.

On the one hand, I'm reluctant to hang them from the halyards, since Adesso never made a direct claim that the keyboard was USB compatible. I took it on faith that I could use a PS2-USB converter and all would be well. Since Adesso does pimp their own PS2-USB adapter, one could assume that this keyboard would work via USB. And it does, sort of…

The problem is with the touchpad. When the keyboard is plugged into a USB adapter, and subsequently plugged into a computer, the touchpad provides its basic pointing functionality. However, any advanced features of the touchpad are unavailable unless it is plugged in directly via the PS2 connectors. That's kind of a bummer, as I use multiple monitors, and without even being able to adjust the touchpad sensitivity, I have a hard time easily pointing, dragging, etc… across my entire desktop.

Adjusting to the Keyboard
Adjusting to the TruForm Pro was essentially a non-issue. Anyone familiar with working on a MS Natural or most other split keyboards will likely find themselves getting back up to productive speed within a day or less. If you are used to working on a traditional unsplit or other “radical” keyboard design, it may take longer. It took me about a day to acclimate for typing, and rather longer to acclimate for pointing via the embedded touchpad. (More on this further on.)

Media Keys
The Adesso comes with a bevy of media keys (play, pause, volume, etc.) splayed out in a row above the Function keys. In addition, it comes with some system keys above the number pad. Frankly, I didn't use the media keys at all and would have paid extra for a keyboard without them. The system buttons (above the number pad) I didn't use either, so I'm not even sure if they worked. However, I think it's a mistake to have such keys on a professional keyboard. Not only do they (and the media keys) take up extra space on the oft-overcrowded desktop, but the idea of inadvertently shutting down my computer by pressing the wrong button sends shivers down my spine. (OK, part of this fear is because my XP box doesn't always act gracefully when it is told to shut down. Often it will proceed to shut down regardless of whether or not there are open documents, etc…) So, I'll give the media keys a rating of "Useless" and the system keys a rating of "Useless, possibly dangerous". I do so with the knowledge that undoubtedly some people think these keys are the greatest thing since sliced bread. To each her own!

The Feel
Adesso's keyboard has quite a unique tactile feel to it. The keys themselves are comfortable, with a slightly cupped (concave) shape to each that lets you properly align your fingers without looking at the keys. The "F" and "J" keys have an embossed dash on them, assuring the touch-typist the ability to properly orient each hand on its home row.
All in all, I really enjoy the feel of the keys…the cupped shape is just enough to notice, and it scores big points with me.

Tactility
The actual act of typing with the keyboard is kind of underwhelming. The key action is, essentially, what I call “Jello Type”. When depressing a key, there’s no real break to indicate when the key has been successfully pressed. As a result, the key press registers at some magical level that takes a while to get used to. This keyboard gets low marks for tactility.

Touchpad

I really like the idea of a pointing device integrated directly into my keyboard. However, I’m somewhat split as to how well the Adesso pulls it off. I do like the integrated touch pad, but over the year I’ve used this keyboard, I never really got used to it. Most of this problem is the result of the touchpad functionality being reduced to its most basic mode when plugged in via a PS2-USB converter. Up until the end of my trial with the Adesso, I still preferred to use my Kensington trackball for mousing. I’m willing to concede that if I ever got the touchpad working properly, I may have found it more natural to use.

When I did use the touchpad, I found it natural and convenient to use either thumb to point, thereby minimizing the movement of the rest of my hand from the home position. This was really only useful for gross pointing activities like selecting between open windows and such. Any type of fine pointing work (pixel editing in Photoshop, designing in Illustrator, and even dragging files between folders) required more dexterity and accuracy than my thumbs can offer, so I'd wind up moving my right hand off of the home position in order to point with my index finger. Inevitably, once I realized I'd moved my hand off of home in order to use the built-in touchpad, I'd often just wind up giving up on the touchpad and using my trackball. This isn't really a limitation of the keyboard…I'd attribute it to my natural preference and the fact that I can work faster and more accurately using my trackball. Those who prefer working with touchpads may have a different experience entirely.

Clicking either left or right mouse button on the touchpad was easy and intuitive with either thumb.

Size and Split
Overall, the keyboard seems quite big. It has roughly the same horizontal split to it as a Microsoft Natural (at least my hands did not notice a significant difference between the two). Still, it is quite a bit larger in both height and depth. Fortunately, much of the extra size seems to be the result of accommodating the built-in touchpad at the bottom and the media keys along the top, so it is natural that it is bigger than its peers.

Support
I'll give Adesso high marks for their support. When I wrote them regarding the problem with using the touchpad when plugged in via a PS2-USB converter, I actually got a human to respond (the same day). We continued a friendly email exchange for a few more days regarding the USB support problem, but ultimately, that dog just won't hunt. This is the only keyboard manufacturer I've ever bothered to contact for support, and I was pleasantly surprised by my experience. I mention it here because it seems to run counter to the general trend in manufacturer support we've seen over the last few years.

Overall

Overall, the Adesso held up very well for the year-plus I used it at the office. I have no complaints as to its build quality, and it held up like a champ. I used it to author and maintain 30+ technical manuals (probably 3,000-5,000 pages), so it got a workout. There were some situations where I did enjoy the integrated touchpad; it just didn't prove to be quite the boon that I had hoped it would be. Honestly, I would have continued using the Adesso had it not been for one simple fact: my hands would no longer let me.

After a year, I had roughly the same amount of pain working on the Adesso as I did before I got it. It seemed to do the trick for a while though. Toward the end, I had taken to peeling the fruit stickers off my lunches and sticking them across the palm rest of the keyboard. Over time, the wear pattern on the stickers indicated that my hands wanted to be further apart than the split in the keyboard allowed. So, I retired the Adesso to my home, where I use it occasionally when I'm working on my laptop.

Conclusion and Rating

This is a tough one. I think for someone who can connect the keyboard to their computer directly via the PS2 ports, it could perform admirably. However, I have a strong personal dislike for the typing action of the keyboard. Initially, it did seem to help with the carpal tunnel pain, but after a year of use, the pain was back in all its glory. I will recommend this keyboard, but with reservations. It has some issues, the most significant of which reduced my rating by a few points. When all is said and done, I’d give it a 7/10.

**Update**
I see that Adesso has released a newer version of this keyboard that now provides USB connectivity. In light of this, I would give the USB version of this keyboard a rating of 8/10. This, of course, assumes that the functionality, build, and feel are identical to the model I reviewed and that everything works as it should.

ePublisher User (as in content consumer) Reaction

Looks like Quadralay's WebWorks Publisher (ePublisher) output is not endearing itself to users.

I work with WebWorksPublisher Pro 2003 and tried ePublisher, but quickly grew tired of the bugs they kept telling me didn’t exist. Finally I gave up on it and stuck with WWPP 2003. Its output is rough, but it plays well with Frame and does some things nicely. Still, Quadralay won’t get another dime from me.

http://forums.worsethanfailure.com/forums/thread/116913.aspx

Oops…I guess you'd call my post venting. I was, however, interested to see an end user's reaction to a helpset created by the oh-so-venerable tool.