Tinderbox
  News:
IMPORTANT MESSAGE! This forum has now been replaced by a new forum at http://forum.eastgate.com and no further posting or member registration is allowed. The forum is still accessible via read-only access for reference purposes. If you wish to discuss content here, please use the new forum. N.B. - posting in the new forum requires a fresh registration in the new forum (sorry - member data can't be ported).
Knowledge Management (Read 6709 times)
Loryn
Full Member
*
Offline



Posts: 97

Knowledge Management
Jun 18th, 2008, 8:10am
 
It may have been when Mark Bernstein first published Ten Tips on Writing the Living Web. I've forgotten whether it was Dave Winer or Robert Scoble or some other blogger whose link I first followed to Mark's piece. That wholesome morsel led me to regular servings of the eclectic panoply served up on Mark Bernstein's blog.

Mark's blog has been in my feed reader since before we began calling them "feed readers."

That was four computers ago.

The first one I used was a VIC-20, almost three decades ago. We soon progressed to a Commodore 64, then a 128D, and on to the Amiga 1000, whose 4 MB RAM cartridge, representing weeks' worth of my father's salary, fell victim to a power surge when I switched the power off and on a tad too rapidly.

The first computer I bought myself was an old XT, purchased just six months before Microsoft's Windows 3.0 launch made it obsolescent. Throughout the 90s I owned a series of high-end Windows boxes and laptops. The power of my machines was still a point of competitive differentiation---I could work faster than my peers because I invested more, a lot more, in raw hardware specs.

Inertia, and my pile of Windows software, kept me chained to the Windows operating system well into the noughties. Oh, yes, I dallied with Linux, and dreamed of freer software; but what advantage is there in trading merely for a clone? Corel and Word and Excel, Delphi and Eiffel and VB, Manila and IE and FeedDemon: that was my home. But I was reading Mark.

It's hard to know what changed. It could have been that my career progression led me to more creative, open-ended problems. Or the iPod halo that began to shine on Macs a few years ago. It could have been the abortive product release that was Vista. But one thing I am certain of: it wasn't Safari or Keynote or iPhoto or iMovie or iCal or AddressBook or iTunes or even Tiger itself that first motivated me to get a Mac. They might be better---and now I think they are---but achieving better has never changed the game. It was Tinderbox's promise of mental leverage that first made me desire a Mac.

I wanted a machine that would run Tinderbox.

Tinderbox, the killer app.
 
« Last Edit: Jun 18th, 2008, 8:14am by Loryn »  
Loryn
Re: Knowledge Management
Reply #1 - Jun 18th, 2008, 8:49am
 
This year I have adopted two powerful knowledge management technologies. The first is Tinderbox. The second is the Pulse SmartPen from Livescribe.

I am hoping the information stream from the Pulse will soon be able to flow into Tinderbox.

I'm waiting on Livescribe to deliver a Mac version of their software, OCR capability, and automatic email. Tinderbox stands ready to receive the first OCRed email from Livescribe.

However, it would be even more powerful if there were a native interface between the Pulse and Tinderbox. Shapes on the page could indicate the beginning of a new note; other shapes could indicate that the new note descends from the previous one.

A written note hierarchy OCRed into a matching digital note hierarchy. An illustrated map view, with links drawn by hand, translated directly into a Tinderbox map view.

A pen beyond the fountain of knowledge.
Loryn

Re: Knowledge Management
Reply #2 - Jun 18th, 2008, 8:50am
 
I'll actually describe my KM application here within 24 hours or so.
Loryn

Re: Knowledge Management
Reply #3 - Aug 24th, 2008, 5:25am
 
Background

My client recently held a Knowledge Cafe. The Knowledge Cafe is designed to enable knowledge sharing across the organisation. The room was organised by table: each table was labelled with a theme and had its own table host (i.e. facilitator). The participants each chose a table.

Some chose a table based on what they had to contribute. Others chose tables based on what they wished to learn.

Information was captured in one of three ways:
1. Participants scribbled notes onto the paper table-cloths.
2. Participants filled out action-cards for topics they took to be important.
3. Table hosts captured their own notes for items they deemed to be significant.

The sessions generated over 1,000 comments, on a whole range of topics and themes. The data needed to be prepared in such a way that results on particular themes could be easily extracted by quite divergent recipients. For example, HR was interested in comments pertaining to cultural climate; Safety was interested in comments relating to incidents and near misses; and operations wanted to know about employee relations.

Method

Here is how I performed this analysis with Tinderbox.

1. The data had been typed into a Microsoft Word document. I extracted the paragraphs into Excel and added meta-data to each row (topic, data source, etc.). I then opened the spreadsheet in Numbers and dragged the entries into Tinderbox, which mapped each entry, with its meta-data, into a note.
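A minimal sketch of that intermediate step, assuming tab-delimited rows whose header row names the attributes each note receives (the comment texts and column names here are invented for illustration):

```python
import csv
import io

# Hypothetical comments extracted from the Word document, with meta-data.
rows = [
    {"Name": "Comment 001", "Text": "Near miss reported at loading dock",
     "Topic": "Safety", "Source": "table-cloth"},
    {"Name": "Comment 002", "Text": "Induction process needs streamlining",
     "Topic": "HR", "Source": "action-card"},
]

def to_tab_delimited(rows):
    """Render rows as tab-delimited text; the header row supplies the
    attribute names for each imported note."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["Name", "Text", "Topic", "Source"],
                            delimiter="\t")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(to_tab_delimited(rows))
```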

2. Using another program, I obtained a unique word list for the entire corpus. Weeding out non-lexical items, I identified the most significant words contained in the texts.
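A minimal sketch of that word-list step in Python (I actually used another program; the tiny corpus and abbreviated stopword list here are invented):

```python
import re
from collections import Counter

# A tiny stand-in corpus; in practice this was the full set of comments.
corpus = [
    "Near miss reported at the loading dock",
    "Incident reporting hotline is hard to reach",
    "The hotline number should be on every notice board",
]

# Common non-lexical (function) words to weed out -- an abbreviated list.
STOPWORDS = {"the", "a", "an", "at", "is", "to", "on", "be", "should",
             "every", "hard"}

def significant_words(texts, stopwords=STOPWORDS):
    """Count word frequencies across the corpus, ignoring stopwords."""
    counts = Counter()
    for text in texts:
        for word in re.findall(r"[a-z]+", text.lower()):
            if word not in stopwords:
                counts[word] += 1
    return counts

print(significant_words(corpus).most_common(3))
```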

3. In Tinderbox, I coded agent queries for words significant for the domain. These queries searched directly against the text.

These are some queries dealing with safety issues.

Code:
Text(inciden)
Text(injur)
Text(breach)
Text(acciden)
Text(disrupt)
Text(near miss)
Text(near hit)
Text(report) 



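In effect, each stem query gathers every note whose text contains that fragment, so one agent catches "incident", "incidents" and "incidental" alike. A minimal sketch of the matching, assuming simple substring semantics (the note texts here are invented):

```python
# Stems from the safety queries above; a trailing fragment matches variants.
SAFETY_STEMS = ["inciden", "injur", "breach", "acciden", "disrupt",
                "near miss", "near hit", "report"]

notes = {
    "Comment 001": "Near miss reported at the loading dock",
    "Comment 002": "Two injuries went unreported last quarter",
    "Comment 003": "Morale on night shift is low",
}

def matches(text, stems):
    """True if any stem occurs as a substring, mimicking Text(stem)."""
    lowered = text.lower()
    return any(stem in lowered for stem in stems)

safety_notes = {name for name, text in notes.items()
                if matches(text, SAFETY_STEMS)}
print(sorted(safety_notes))
```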
4. I then coded a second layer of queries aimed at aggregating unique topics.

This is the semantic aggregation for "incident".

Code:
(#inside(inciden)|#inside(injur)|#inside(breach)|#inside(acciden)|#inside(disrupt)|#inside(near miss)|#inside(report))& !#inside(hazard) 



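In set terms, that aggregation is a union of the first-layer stem agents minus an exclusion. A minimal sketch with invented note sets:

```python
# Hypothetical note-name sets produced by the first-layer stem agents.
inciden = {"n1", "n2", "n5"}
injur   = {"n2", "n3"}
breach  = {"n4"}
hazard  = {"n5", "n6"}   # notes to exclude from the "incident" topic

# Union of the safety stems, minus anything classified as hazard,
# mirroring (#inside(inciden)|#inside(injur)|...) & !#inside(hazard).
incident_topic = (inciden | injur | breach) - hazard
print(sorted(incident_topic))
```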
5. Each topic was then ranked by number of items. The ten most significant topics were then analysed relative to significant sub-topics.

For example, Incident Analysis was broken down into the sections:
  • incident alerting
  • incidents involving near miss
  • incidents and staff
  • incident feedback
  • incident other
  • injuries
  • incident issues
  • incident hotline
  • incident information
  • incident briefings and debriefings
  • incident lessons learned
  • incidents and just culture


6. This sub-topic analysis was again implemented through agent queries, defined by set logic using unions, intersections and negations of sets at the semantic level.
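A minimal sketch of how such sub-topics compose from the semantic-level sets, and how they rank by item count (the note sets are invented):

```python
# Hypothetical semantic-level sets of matching note names.
incident  = {"n1", "n2", "n3", "n4", "n5"}
near_miss = {"n2", "n5", "n9"}
staff     = {"n3"}

subtopics = {
    "incidents involving near miss": incident & near_miss,           # intersection
    "incident other":                incident - (near_miss | staff), # negation
    "incidents and staff":           incident & staff,
}

# Rank sub-topics by number of items, largest first.
for name, items in sorted(subtopics.items(),
                          key=lambda kv: len(kv[1]), reverse=True):
    print(f"{name}: {len(items)}")
```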

7. The final type of analysis was the construction of a Thesaurus. To do this, I created a hierarchy of topics by which I could arrange the aliases of the semantic-level items. My top-level hierarchy comprises:

  • people
  • change
  • processes, stages and mandated activity
  • information artifacts
  • physical entities
  • groups and divisions within the corporation
  • tools and technology
  • external entities


Expanding people, we have:

  • individual
  • multiple
  • states of being
  • activities and behavior


So at a glance, recipients of the analysis can browse through the Thesaurus to find categories of description used by people within this large client organisation. The Thesaurus is linked with the semantic-level items, whose children consist of items focused on selected topics.

8. The final stage in the analysis was simply producing a bunch of web-pages into which I could flow the analysis for presentation to the client's recipients. Tinderbox exported thousands of inter-linked web-pages, so each recipient can focus on understanding the data relating to their area of interest.
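Tinderbox's template-driven export did the real work here; as a minimal stand-in for the shape of the output, one page per topic plus an index linking them (topics and file layout invented):

```python
import pathlib
import tempfile

# Hypothetical topic -> matching-comment mapping from the analysis.
topics = {
    "incident": ["Comment 001", "Comment 002"],
    "hazard": ["Comment 005"],
}

def export_pages(topics, outdir):
    """Write one HTML page per topic plus an index page linking them all."""
    out = pathlib.Path(outdir)
    links = []
    for topic, items in topics.items():
        body = "".join(f"<li>{item}</li>" for item in items)
        (out / f"{topic}.html").write_text(
            f"<html><body><h1>{topic}</h1><ul>{body}</ul></body></html>")
        links.append(f'<li><a href="{topic}.html">{topic}</a></li>')
    (out / "index.html").write_text(
        f"<html><body><h1>Topics</h1><ul>{''.join(links)}</ul></body></html>")

with tempfile.TemporaryDirectory() as d:
    export_pages(topics, d)
    print(sorted(p.name for p in pathlib.Path(d).iterdir()))
```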

The client was exceptionally pleased, both with the analysis and with the powerful tool I had developed for them in the space of just five days. They now have a methodology: a mechanism embodying significant re-usable, domain-specific linguistic expertise for studying further textual corpora produced by the knowledge management program.