There are two webcasts planned for November that should be of interest to DPM customers (and those still considering whether DPM is right for their business).
Microsoft Webcast: Protect Critical Data While Lowering IT Costs with Microsoft System Center Data Protection Manager 2006 and HP
Thursday, November 10, 2005 - 9:30 AM Pacific Time (US & Canada)
How much down time can your business afford? More than 50% of companies that experience a major data loss will close their doors forever. Increasing rates of data growth, higher service level expectations, and tighter technology budgets have left IT managers with fewer resources to protect critical data. Join HP and Microsoft for a webcast that presents strategies that you can use to protect your data while lowering IT costs. Learn about the HP ProLiant Data Protection Storage Server, a disk-based backup solution that is easy to deploy, powered by Microsoft System Center Data Protection Manager 2006, and built on the ProLiant server platform.
Presented by Dave Ryan, Sr. Product Manager, Microsoft Corporation, and Brad Parks, Product Marketing Manager, HP
TechNet Webcast: A More Secure and Well-Managed Infrastructure (Part 12 of 18): Managing and Protecting Storage (Level 300)
Wednesday, November 23, 2005 - 9:30 AM - 10:30 AM Pacific Time
As an information technology professional, you spend a significant amount of time managing storage. Are you combining this task with sound security practices? This webcast discusses the management and protection of storage inside an organization, why a storage and recovery plan is vital, how to plan for storage and recovery, and what features in the operating system can help you successfully implement your plans. This session also introduces Microsoft Data Protection Manager, what it is, and how to implement and use it. The presentation includes information about Windows Rights Management Services for Microsoft Windows Server 2003 as a method for securing information on the network.
Presented by Kai Axford, Security Specialist, Microsoft Corporation
Selena Sol at extropia replies to the question, "What would you want to see in software manuals/technical writing to further disassemble the mystery?" with a thoughtful discussion of what is needed in documentation, "putting the writer back into technical writing".
Being in that field myself, I first thought that these were pretty basic guidelines -- everyone knows all this by now, yes? On the other hand, consumer complaints continue. So I looked around for something recent and technical to evaluate...
Last week I bought a headset phone, for those lengthy conference calls with India late at night. Setting it up and using it was slightly frustrating, but I did get it working. So I took a look at the instruction manual again tonight with my technical writer hat on.
The first section lists all the parts, with illustrations. Nicely done. This is useful. But they flag the battery item with a note: "The battery is located inside the cordless phone battery compartment and must be connected correctly prior to charging." What am I supposed to do with this information right now?
Right after the note begins the second section, "Setting up". There's a long item on jack requirements and another on installation guidelines. Aha, the next page covers connecting the battery, finally, but there's no task listed for charging the battery. (The content is there, just as a note under "Connecting AC/DC power". This could all be reorganized to be easier to follow.)
It also says not to connect the phone jack until the battery has charged for 12 hours. But the step of placing the phone in the base for charging appears in the "Connecting the telephone line" task -- so which comes first?
Jumping to the next section, I find "Features and Customization". I'd prefer tasks. I understand the procedure for area code selection, but not why I'm selecting an area code. In "Telephone Operation", there are some tasks (making/answering and ending calls, all lumped together) and some features (Ringer Switch, and Call Timer).
So the lesson for me to keep in mind is nicely stated by Sol: "Your readers did not pick up the documentation to learn about the product. They picked it up because they needed to do something with the product."
(Someday I'll blog about exceptions to that...)
Esquivalience—n. the willful avoidance of one’s official responsibilities
Only, there's no such word. The fabricated esquivalience was included in the recently published second edition of the New Oxford American Dictionary as a copyright protection strategy.
This caught my attention because we're working on the vocabulary for the next version of DPM, the terms and definitions that will populate its glossary. And thinking about esquivalience as a fabricated term made me conscious that we repurpose words much more frequently than we create them. I haven't been able to identify a single term in Microsoft's products that was a new, invented word, even when it described a function, action, or process that no existing word covered. That made me think about why we choose to redefine rather than to create.
And that path of research led me to human-computer interaction theory and the idea that, in general, software is designed around a story of user actions that relies on metaphoric symbols to evoke familiar concepts. The action of putting a file in a recycling bin on a computer isn't the behavior you perform when you put a file in a physical recycling bin, but the underlying context -- "I am done with this file and want to get rid of it" -- is familiar to the user. So we repurpose words that users can extrapolate to a new application.
For example, take DPM's file agent. This phrase uses conceptual combination to leverage familiar meanings into a new concept. We might just as easily have called it a grollep, and defined grollep as "Software, installed on a file server, that records changes to protected data in a synchronization log, and transfers the log from the file server to the DPM server".
We didn't. We didn't discuss whether to invent a word. It was taken for granted that we would add new, specialized definitions to familiar words or combinations of words. And ease of learning is a compelling argument for that practice. But now I have a new question to ponder in idle moments: Would there ever be a software application that could find no metaphor?
Here I am, partway through the exams for the MCSA, and we announce "a new generation of Microsoft certifications". I click frantically through the FAQ, trying to find out if my efforts have been wasted. Then, a sigh of relief -- I'm not wasting my time and should continue along this track to complete the MCSA.
So, now that I've been reassured that current certifications are still valid and recognized, these new plans actually look quite interesting. To quote, from the FAQ:
"The program introduces three series of credentials:
Technology Specialist (MCTS): This series validates core technology and product skills, such as how-to or implementation skills.
IT Professional and Professional Developer (MCITP and MCPD): This series validates specific job-role skills outside of core technology, such as operational processes and procedures, and analyzing business problems.
Architect (MCA): This series validates the skills required to successfully impact business IT. These include technical breadth, technical depth, communication, strategy, organizational politics, process, and leadership."
The first credentials and certifications to become available are SQL Server 2005, Visual Studio 2005, and BizTalk Server 2006. For Windows folks, "When Microsoft releases the next version of the Windows client and server products, the new credentials will follow the new structure, with a Technology and Professional series. MCSEs, MCSAs, and MCDSTs will be offered an upgrade path to the new credentials."
I found this Q&A interesting also:
"Do the job-role certifications cover only Microsoft technologies? No. Although Microsoft technologies will remain the primary focus, we recognize that our technologies are often used in heterogeneous environments. The Professional Series credentials require candidates to understand core job concepts and have the ability to work across multiple environments, while focusing mainly on Microsoft technologies."
The more I read about it, the more promising this sounds.
The transcript from the executive chat session, System Center Data Protection Manager Revealed (Oct 13, 05), is now available online.
A few features of the next version of DPM are mentioned: support for SQL, Exchange, and clustering. We've had a lot of customers asking for those, so "V2" should be very popular!
Here are two ways to evaluate DPM...
By the way, if you already tried the beta edition of DPM, you may recognize that some of the differences in the eval edition reflect feedback from the beta test. But please don't try to upgrade your beta product to the evaluation edition - it won't work. You'll have to do a new installation with the evaluation edition.
If scripting isn't already part of your routine, it can be worth learning. As a system administrator once told me, "If you have to do it more than once, run a script."
The Scripting Guys at Microsoft provide great advice and resources for scripting, including the upcoming Scripting Week 3, a series of webcasts that will take place Oct 24-28. (Each webcast will be available on demand later, if you can't make it to the live version.)
As if the webcasts themselves weren't enough, check out the goodies on that site: 40% off on selected Microsoft Press books, giveaways (including several Windows Server 2003 Resource Kits), and the opportunity to win a trip for two to Seattle that includes a day on the Microsoft campus and dinner with :::drumroll::: the Microsoft Scripting Guys!
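In the "run a script" spirit, here's a minimal sketch of the kind of chore scripting pays off on: reporting the largest files under a directory. (Python is just my example language here -- the Scripting Guys' webcasts focus on Windows scripting tools -- and the function name is my own invention.)

```python
import os

def largest_files(root, count=10):
    """Walk a directory tree and return (size, path) for the biggest files, largest first."""
    sizes = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                sizes.append((os.path.getsize(path), path))
            except OSError:
                continue  # file vanished or is unreadable; skip it
    return sorted(sizes, reverse=True)[:count]

# Example: the ten biggest files under the current directory.
# for size, path in largest_files("."):
#     print(f"{size:>12,}  {path}")
```

Point it at a staging folder and the space hogs identify themselves -- the second time you need that answer, the script has already paid for itself.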
Documentation typically goes through numerous passes, between writers and technical experts and editors and peer reviewers. And those passes can accumulate a lot of revision marks and comments. All part of the work process. But let's suppose your product team is working with a partner who needs an advance look at content in progress...
You're probably wondering what I forgot to delete when I sent it out, and I'm not going to incriminate myself. However, I am going to let you know about a great tool I found through the Useful Technology Blog that will clean up your Word, Excel, and PowerPoint files. The applications themselves have this functionality, of course, but you can run this tool from the command line to clean up multiple files at once - big plus there!
The tool is the Office 2003/XP Add-in: Remove Hidden Data. Don't forget to save a copy of the file with all that hidden data in case you need it again later!
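The add-in's download page documents its command-line syntax, so I won't reproduce it here. But the "save a copy first" advice is easy to script. Here's a minimal Python sketch -- the function name and folder layout are my own -- that copies Office files into a backup folder, preserving subfolders, before you batch-clean the originals:

```python
import shutil
from pathlib import Path

OFFICE_EXTENSIONS = {".doc", ".xls", ".ppt"}

def back_up_office_files(source_dir, backup_dir):
    """Copy Word, Excel, and PowerPoint files into backup_dir, preserving subfolders."""
    source = Path(source_dir)
    backup = Path(backup_dir)
    copied = []
    for path in source.rglob("*"):
        if path.is_file() and path.suffix.lower() in OFFICE_EXTENSIONS:
            destination = backup / path.relative_to(source)
            destination.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, destination)  # copy2 preserves timestamps
            copied.append(destination)
    return copied
```

Run this first, then turn the cleanup tool loose on the originals, and the hidden data is still there in the backups if you ever need it again.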
There's an old song (70s) with a chorus of "Things get a little easier once you understand." (Okay, I'm dating myself by admitting I remember when the song came out, and it's a really bad song, but the thought fits here.) Things can be easier...to accept, to work around, to deal with...when you understand the reasons.
For example, throughout our content, we tell you that DPM is designed to run on a dedicated, single-purpose server. The server must not be a domain controller or application server.
But why? The question came up in our newsgroup, and I thought the answers worth repeating here.
You can't install DPM on a domain controller because DPM creates its own low-access local accounts for security purposes, and it cannot do so on a domain controller.
Security is important, granted, but we don't just say "no domain controllers", we insist on "a dedicated, single-purpose server" -- again, why?
When I joined the DPM group, I started off digging through the specs to learn about the product and I discovered a wonderful document called "Master Scenarios". What this document did was tell the story of a fictional customer and DPM.
I had read other content on DPM before this, marketing materials and Help-in-progress and such, and from those I'd begun to grasp how parts of DPM worked and formed a fuzzy idea of how it was intended to fit into a business solution. But it wasn't until I read "Master Scenarios" that the light bulb came on.
In this story, a make-believe company decides to use DPM, and then a make-believe administrator -- we'll call him Joe -- deploys and manages it. "Master Scenarios" details each decision Joe might make and each task he might perform, and describes how the product will respond and how Joe will interact with it. For instance, "Joe selects the Reporting tab. Because Joe has not configured reporting for DPM yet, he sees a dialog box for configuring the credentials for reporting." (I'm rephrasing from memory, btw.)
The document is a long story, since the goal is to capture the entire experience of using DPM, and it provides a framework and vision for the development of the product itself. For me, it brought all of the features together into a coherent whole. I still had all the technical details to learn, but because of the "Master Scenarios", I could place those details in their proper context and understand their relationships to the user experience.
If you've read Chapter 1 of the DPM Operations Guide, you figured out where it came from several paragraphs ago. It's a condensed version of the "Master Scenarios" approach, telling the story of what Joe goes through in administering and supporting DPM. The intent is to create a framework of expectations to encompass the detailed information in the rest of the Ops Guide. Hopefully, customers will find it useful. (No complaints yet, which is a good sign...)
I'm a senior technical writer on the Windows Server Central Services team, working with the System Center Operations Manager product group. I started at Microsoft in 1997 in Training & Certification as an instructional designer on the Windows 98 courses. After many years in T&C (which is now Microsoft Learning), DPM offered me the opportunity to work directly with a software development team...and isn't that what Microsoft is all about? After two releases of Data Protection Manager (DPM), I transferred to the Operations Manager team to focus on management pack guides.
A really common annoyance (for me) is trying to rename, move, or delete a file, only to be told that I can't because "something" is using it -- but the error doesn't tell me which "something" it is, and most of the time I can't figure it out. I'll open Task Manager, click the Processes tab, and stare at the CPU and memory usage columns, but it doesn't help. I shut down processes I don't recognize. I reboot. I give up.
Microsoft.com Operations blog to the rescue. Today's post on tools includes a link to a tool that sounds perfect for my problem -- Process Explorer. "Useful for looking at a process's dependencies and any open handles a process has. Handy in cases where a file is in use, and you're not sure what's still holding on to it." Is that made for me or what?
Lots of other useful information (and links) over there -- take a look. They might have the tool you would have searched for, if only it had occurred to you that it existed.
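Process Explorer itself is a Windows tool built on Windows handle APIs. Just to illustrate the "who has this file open?" idea, here's a rough Python sketch of the same concept for Linux, where the /proc filesystem exposes each process's open file descriptors as symlinks. (The function name is my own, and this is a toy, not a substitute for the real tool.)

```python
import os

def processes_holding(path):
    """Return PIDs of processes that have `path` open (Linux /proc only)."""
    target = os.path.realpath(path)
    holders = []
    for pid in os.listdir("/proc"):
        if not pid.isdigit():
            continue  # /proc entries that aren't process IDs
        fd_dir = os.path.join("/proc", pid, "fd")
        try:
            fds = os.listdir(fd_dir)
        except OSError:
            continue  # no permission, or the process already exited
        for fd in fds:
            try:
                # Each entry in /proc/<pid>/fd is a symlink to the open file.
                if os.path.realpath(os.path.join(fd_dir, fd)) == target:
                    holders.append(int(pid))
                    break
            except OSError:
                continue
    return holders
```

Run it as root to see every process; as an ordinary user you can only read your own processes' fd directories.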
Today, I'm thinking about product Help, DPM's in particular. Mostly I'm thinking about the questions I want to find answers for next week (because I'm taking the rest of this week off).
For instance, what was our guideline in DPM 2006 for deciding what would go in Help and what would be documented in other forms? DPM was well into development when I joined the team last winter so I missed out on the content planning phase.
What guidelines do other products use for selecting Help content?
What research has been done around usability and customer preferences?
My manager probably has most, if not all, of those answers, and I'm not going to try to reinvent the wheel. Still, it's a fun mental exercise: how would you define Help? Would you include everything that could possibly be known about a product, and if so, how would you make it all discoverable? If you would focus it, what would be your criteria for inclusion?
Those are only pseudo-hypotheticals...if you'd like to actually offer your opinions, feel free to send them to firstname.lastname@example.org.
Yesterday, while I was watching a usability session, the participant mentioned that he prefers to open up a new tool and just poke around, rather than reading the documentation. I can empathize with that approach -- I do it myself sometimes.
"Explorer" types like to jump right in and poke around. Maybe try to do something to see if they can, maybe try something just to see what will happen. Can they figure it out just by following the clues? That's a fun quest. Metaphorically, it's like hitting the open road and taking turns on impulse. And if you get too lost (yes, there are degrees of lost!), you can always reach for the map.
So for explorers, documentation is useful when it serves the same purpose as a good map: it contains both the big picture and the details, uses clear labels, is easy to navigate, and, most importantly, is complete. (Maybe I don't need to know there's a river alongside the road, but I want to know.)
Fortunately, explorers are usually accustomed to adjusting to the arbitrary limitations of linear information. They realize that an unfamiliar term is only undefined because they skipped the previous five chapters. I can't count the number of times I've read a manual backwards because I found the information I wanted on page 236, but to understand it I needed the explanation on page 198, and it relied on a definition on page 147...you get the idea.
I don't expect that the content designer can predict what I will want to know in the context I want to know it, not for me and also the zillion other people who might use that content. A more reasonable expectation is that the content designer will try to include everything I need to know, much that I want to know, and make it all easy to find.
Not too much to ask, is it?
Please join our execs for a live chat on DPM on Thursday, October 13th, at 10:00 A.M. (Pacific). See more details at: http://www.microsoft.com/communities/chats/default.mspx#05_1013_TN_EC
...for an open discussion on the newly released backup and recovery solution from Microsoft - System Center Data Protection Manager (DPM). Learn key usage and deployment scenarios that can help protect your business-critical data, and discover how this new product enables rapid and reliable recoveries in your IT environment. Also, learn how DPM can help you lower your total cost of ownership with efficient and near-continuous data protection. This is a key opportunity to provide feedback, ask questions, and get answers on the current and future versions of Data Protection Manager.
Before yesterday, I'd only used System Restore a few times: when I was trying out a lot of changes and wanted to be sure I undid them all, and once when I hit something in Internet Explorer that made it look funky, and I got tired of trying to troubleshoot it. Oh, I also used it a few times over the phone, helping out family members, when it was easier to talk them through System Restore than to try to figure out what they'd done to their computer and fix it. Very handy feature, I thought.
Then yesterday, I had one of those supremely ugly installs occur. The kind where your only way out is a hard boot. And when I powered it back up, I got a lovely message that my computer was now in "an unstable state". It recommended I uninstall the incomplete-install. And when I tried that, the "New hardware found" balloon went berserk, suddenly finding every bit of hardware in the box. Seriously. About the only hardware it didn't rediscover was the screws. And on my next reboot, I was reminded that I only have 3 days to activate Windows (because of all the new hardware).
So I opened System Restore, picked a restore point from the day before, and crossed my fingers. For all I knew, whatever changes had just been made on the computer could be out of the scope of System Restore's reparative powers. I consoled myself that I had a recent backup and, though it would be a major pain, I could eventually put Humpty Dumpty back together again. But I really didn't want to go through all that.
And I didn't have to. System Restore did the job. To everyone who contributed to that feature, thank you!
One of the most interesting aspects of working on the next version of DPM is that we -- the writers -- are here in Redmond, and the product team (a great group of developers and PMs) is at IDC in Hyderabad. That's a 13.5-hour time difference. Except we're still in daylight saving time, so it's only 12.5, right? Either way, it's tomorrow there during my tonight -- that much I've got straight.
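For anyone who wants to check my arithmetic, here's a small Python sketch (using the standard zoneinfo module, a convenience we certainly didn't have in 2005; the function name is my own) that computes the Redmond-Hyderabad gap for a date in daylight saving time and one in standard time:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

def redmond_hyderabad_gap(when):
    """Hours between India time and Pacific time at a given naive local moment."""
    pacific = when.replace(tzinfo=ZoneInfo("America/Los_Angeles")).utcoffset()
    india = when.replace(tzinfo=ZoneInfo("Asia/Kolkata")).utcoffset()
    return (india - pacific).total_seconds() / 3600

print(redmond_hyderabad_gap(datetime(2005, 10, 15, 12, 0)))  # 12.5 (Pacific daylight time)
print(redmond_hyderabad_gap(datetime(2005, 12, 15, 12, 0)))  # 13.5 (Pacific standard time)
```

India stays at UTC+5:30 year-round; only the Pacific side moves, which is why the gap shifts by exactly one hour twice a year.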
We have a list of tactics for managing collaboration across continents and time zones. Managing our email expectations, for one...we're not likely to get instant replies unless we're online late at night (like now, though my excuse is that I'm keeping busy until it's time for our conference call). Conference calls, juggled so they're not scheduled too late for us or too early for them (and I much prefer evening to getting up at o'dark thirty). Posting documents to a server and using threaded web discussions. Of course, nothing will replace the time-honored tactic of cornering a developer in his office and refusing to let him out until he reviews your file...
Tonight's conference call is a discussion of one of the wizard specifications. Have you ever been working in an application and wished you could turn around and ask the person who created it, "Why am I clicking Next again??" Well, this is when we get the opportunity to ask things like that, and it's very satisfying!
Now that DPM 2006 has been released, it's time to do it all again. We had our content kickoff this morning for version 2 of DPM. To me, this is the "anything is possible" stage, when we can air our most grandiose visions. "And the user chooses a scenario - click - and every chapter he needs is instantly assembled (and none that he doesn't need)."
This is also the stage where we need to examine everything we've done previously and figure out how to improve it. (Triage of those improvements comes later.) So, if you're a DPM customer and have feedback on any of the DPM content (Help, error messages, wizard screens, UI text, Planning & Deployment Guide, Operations Guide, Management Pack Guide), please email your ideas to Dpmfdbk@microsoft.com.
*ASR = Automated System Recovery
In our documentation, we recommend, "To ease recovery in the event of system partition failure, install DPM to a partition that is separate from the system partition." We also recommend that you create an ASR set to back up the server system state. How are these recommendations connected?
It has to do with the way DPM works. The DPM program files include mount points to the replicas and shadow copies in the storage pool. If you install DPM on the system partition, then ASR (or any other method of system state backup) will also back up the storage pool files through those mount points, resulting in a huge backup that will fail on restore because the mount points won't be there.
So if you followed our recommendations, you installed DPM to D: (or other drive letter as appropriate) and then the mount points are also created on D:. You create your ASR set from C:, avoiding the mount point issue entirely.
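To illustrate why backing up through mount points bloats a backup, here's a toy Python sketch -- not how ASR actually works, and the function name is my own -- of a file walk that prunes mount points so it doesn't sweep in everything mounted beneath the tree it's walking:

```python
import os

def files_excluding_mounts(root):
    """List files under root without descending into mount points."""
    found = []
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune subdirectories that are mount points, so os.walk won't descend
        # into them. A backup without this check would follow mount points
        # (like the ones DPM creates to its replicas) and copy everything
        # mounted beneath them too.
        dirnames[:] = [d for d in dirnames
                       if not os.path.ismount(os.path.join(dirpath, d))]
        for name in filenames:
            found.append(os.path.join(dirpath, name))
    return found
```

Installing DPM to a separate partition accomplishes the same exclusion by layout: the system state backup of C: simply never reaches the mount points on D:.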
(The ASR set isn't your DPM backup, though. Check out "Using Data Protection Manager and Windows Backup" and "Archiving and Restoring Data" for instructions on backing up and restoring DPM.)
I must confess that I am not an early adopter of software upgrades. I like a high degree of comfort with the applications I use frequently; I want to open an application and do whatever I feel like doing without needing to figure out how. And most of the time, the apps already do what I need personally, so cool new features aren't often compelling to me.
I'm particularly ornery about Office in that way, especially Word. To be truthful, I was quite happy with Word 97 and have grudgingly adjusted to each revision since. So I'm trying to figure out just why I'm excited about Office 12 -- eager anticipation of a new version of Word is not my usual style.
Part of it is probably due to that "grudging" relationship...I'm not so attached to the latest version. And because I'm not so attached, the amount of change we'll get in Office 12 seems more like an adventure than an upheaval. But what really really has engaged me has been Jensen Harris's An Office User Interface Blog and the story he's telling around how they designed the interface and why they made certain decisions.
Knowing the "why" makes a big difference to me. Without the "why", I'm resistant even though it's always promised to be faster ~ easier ~ more efficient ~ your reason for getting up in the morning. Jensen tells me why they're using the new UI for specific products only. He tells me why icons vary in size. He tells me why they went with design galleries.
And the more I know about what they were thinking as they designed this UI, the more interested I am in playing with it.