Blockchain property deeds and wealth liberation

In his under-appreciated classic “The Mystery of Capital: Why Capitalism Triumphs in the West and Fails Everywhere Else“, Peruvian economist Hernando de Soto argues that the legal structure of property and property rights is a major determinant of the economic success of a country. “Every developed nation in the world at one time went through the transformation from predominantly informal, extralegal ownership to a formal, unified legal property system [and this] allowed people everywhere to leverage property into wealth.”

Trillions of dollars of economic value are trapped in informal assets that cannot, for example, be leveraged to secure loans or otherwise bootstrap wealth creation.

Imagine if you had an inexpensive, fraud-proof way to register and regulate these assets. Could this finally be the breakthrough use of blockchain?

Police Kanban

Why don’t the police use a public Kanban board to show the progress of criminal cases through the system?

Their work rate and priorities could be assessed openly, which would be great for transparency. Victims, journalists and other interested parties could track cases without needing to call the police.

This occurred to me after reading about a dreadful case in Sweden where a child rape victim’s case had not been processed after a year, and her attackers were still roaming about in the community while everyone waited for the police to investigate. Journalists were calling the police for updates. The lack of transparency, combined with public ignorance about both the scale of certain crimes and the police’s under-resourcing, all contributed to the situation.

Making the police workload publicly visible could really help focus resourcing discussions.

Tesla iPhone Case

“If you can’t quite afford a Tesla, then the next best thing is to get their iPhone Case. The Tesla Design Leather iPhone Case ($45) is made from the same supple Nappa leather they use in all of their cars. They have a case to fit either the Apple iPhone 6/6s or the iPhone 6 Plus. This will make a perfect gift for any car enthusiast.”

Source: TESLA DESIGN LEATHER IPHONE CASE | Muted

Personal Wikis and Link Autosuggestion

I absolutely love wikis and have used them personally and professionally for years.

I was not surprised to learn recently that the US intelligence community uses them extensively 1, as does the UK’s GCHQ 2.

I think I started out with WikidPad as my personal wiki, before it was even open-sourced. It was (and is) a phenomenal wiki: Windows native, but Python based, so with some effort you can get it running on Linux and OS X too.

When I started using OS X both at work and personally, I moved my WikidPad notes to nvALT, another stupendous personal information manager that combined near-instantaneous search with the ability to create a note right out of your search, plus super easy note linking with link autocompletion.


A killer feature for me is the ability to get link suggestions/autocompletions as you type. Just type [[ and start typing a name; if it exists, you get a list of matching notes you can select and link to. Confluence, WikidPad, nvALT and SharePoint Wiki all have this natively. You can get it in MediaWiki with plugins like LinkSuggest, but it only starts to suggest after the first three letters. The feature is missing from OneNote, although linking via [[ is supported.

Whilst I loved nvAlt for my personal wiki / notebook, I also wanted a public notebook or wiki.

I tended to find myself using one of two platforms for public wikis: MediaWiki or Confluence.

I had been using MediaWiki for several projects (e.g. the Belgrade Foreign Visitors Club wiki) and found it a phenomenally powerful platform, especially when you extend it with plugins like Semantic MediaWiki. I also greatly enjoyed Confluence. I used it for many years in a former company, where it was an indispensable tool for us. We used it for all our internal documentation, but also for external-facing user documentation.
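To give a flavour of what Semantic MediaWiki adds on top of plain MediaWiki, here is a minimal sketch of its annotation and inline-query syntax. The `Has capital` property and the pages are made-up examples, not from any real wiki:

```
<!-- On a page called "Germany": annotate a fact in-line -->
[[Category:Country]]
The capital is [[Has capital::Berlin]].

<!-- On any other page: an inline query that lists every
     country together with its annotated capital -->
{{#ask:
 [[Category:Country]]
 |?Has capital
}}
```

The annotation renders as an ordinary link, but the fact becomes queryable wiki-wide, which is what makes the extension so powerful.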

Confluence is hard to beat on features, especially the much-loved link autocompletion. It is a full-on enterprise wiki, but that comes at a price. Unlike MediaWiki, you need a dedicated VM / computer to run it. It is Java based and needs loads of memory to be performant. The license is dirt cheap for individuals and small teams ($10 for 10 users), but as soon as you exceed this you are paying big bucks for the software. You also need to be fairly technically proficient to operate a Confluence instance, though it is very well supported.

These days I am mostly using OneNote for my notes and personal wiki. It is an absolutely superb piece of software that “just works” on every platform I use (Windows, OS X, iOS, Windows Phone). I have filed a feature request (internally) with the OneNote team for them to support link autocompletion. If you like the idea, please vote for it on the OneNote team’s UserVoice.

I have been tempted to use OneNote as a public wiki too. It is trivially easy to share a notebook with the public. The only problem is that the URLs are ugly and the notebook cannot be styled to look unique to you.

If I can find a way to easily shuttle my OneNote notes to Confluence, I may have a winner. I can do all my composing in OneNote, then just publish to Confluence 3.

I am already considering doing this for blogging, now that OneNote for Windows has a blogging feature.

If you are looking for some resources to get started with your own wiki, here you go….

See also:

Transclusion – the inclusion of the content of one document in another document by reference. In Confluence, for example, you can mark up some text on one page and pull that text into another page with a placeholder macro. This is super useful for avoiding duplication of content.
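As a sketch of how this looks in Confluence’s classic wiki markup: mark the reusable text with the Excerpt macro on the source page, then pull it in elsewhere with Excerpt Include. The page name “Release Notes” and the text are made-up examples:

```
On the source page "Release Notes", wrap the reusable text:

{excerpt}Version 2.1 requires Java 8 or later.{excerpt}

On any other page, transclude it by page name:

{excerpt-include:Release Notes|nopanel=true}
```

Edit the excerpt once and every page that transcludes it stays in sync, which is the whole point.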

  1. “Structured Analytic Techniques for Intelligence Analysis” by Richards J. Heuer, Jr., and Randolph H. Pherson (2011)
  2. One of Edward Snowden’s leaks was a copy of the “Internal Wikipedia” used by GCHQ
  3. Plugin writers, I beseech you!

Kevin Kelly on design and the Scientific Method

[I noticed I had 36 posts in the drafts folder, some dating back years. It can be quite fascinating to see what had your attention years ago. This one, last edited in March 2009, is just a collection of notes for a post, but there were some gems from Kevin Kelly]

Totally engrossed in the subjects of resource and pipeline management, information design, intermediate technology and dashboard design.

“An n-dimensional gigantic hypercube of all the possible solutions to how to design things, and we are just wandering around trying to find the best one.” – Stack Overflow podcast

How do committees invent?

In a discussion on Zen and The Art of Motorcycle Maintenance, Kevin Kelly made this observation:

From Pirsig’s description of Scientific Method:

* Statement of problem
* [ hypothesis
* [ experiment
* conclusion

Consider a parallel with software design:

* Statement of requirements
* [ architect/design
* [ implement/test
* deliver

That is, Scientific Method consists of a statement of the
problem, followed by a repetition of: generate hypotheses
and perform experiments to test hypotheses, followed by
a conclusion. Software design can be considered to be a
statement of requirements, followed by a repetition of:
generate a proposed design then implement and test it;
followed by delivery of the final system.

Now, Pirsig goes into the fact that what seems like it
should be the hardest part–generating viable hypotheses–
in practice turns out to be the easiest. In fact, there’s
no end to them; the act of exploring one hypothesis brings
to mind a multitude of others. The harder you look, the
more you find. It is an open, not a closed, system.

I would suggest that this correspondence holds: that
the set of possible designs to meet the requirements is
infinite; that the act of generating a design brings to
mind multiple alternatives; that generating a design
increases, rather than decreases, the set of possible
alternative designs.

This is argument by analogy and therefore not particularly
forceful, but I feel certain, myself, that it holds. It
certainly feels right, intuitively. I think it ties in
with Goedel’s work on decidability: that any sufficiently
complex system–which any programming language is–is able
to say more than it can prove. Thus there’s always another
hypothesis that might give better answers; there’s always
another design that might solve the problem better. There’s
always room for an architect that can pull the magic out
of the clouds.

That last bit ties in to a point I’d like to expand on. That
is, that all formalisms, or design methodologies, are in
some way limiting. By adhering strictly to a particular
design process, you forego the gains that come from
inventing a new, better process.

Admittedly, you also ‘forego’ the time lost on ideas
that don’t work out.

Process or methodology is a means of getting a Ratchet Effect,
or Holding The Gains. It’s a way of applying
a pattern of development to other, related, projects.
There needs to be a way of allowing for new developments
and ideas, though.

“There’s no one more qualified to modify a system than
the last person to work on it”. That seems counter-
intuitive; one would think that the people that created
it understand it best. However, they’ve moved on to
other things, while the later maintainers got the
benefit of all the original designers’ work plus,
in addition, all that was later learned about the
system, such as how it reacts to the customers, and
how it responds to maintenance.

Software design is made up partly of flashing new insights,
and partly of routine solutions that have been invented over
and over again. Codifying patterns is a way of ratcheting
the whole community up to near the level of the leaders, at
least in terms of the routine solutions.

It’s still necessary to allow for the insights, though. A
lot of the big-company emphasis on process ignores this, assuming
that nothing is ever new, and that the answers of yesterday
are good enough for tomorrow.

(this is turning into a pretty good rant, but I think I’ll
cut it off for now)

— KevinKelley – http://clublet.com/why?ZenAndTheArtOfMotorcycleMaintenance

[Dec 2014: Sadly Clublet.com is not working, and archive.org has no archive of this page]

Digital Militias

The term digital militias is usually used to refer to online social media fighters, often paid, who agitate on behalf of their chosen cause.

Every conflict has cadres representing both sides who slug it out in forums, on Twitter and Facebook.

I have another idea about digital militias. It stems from my observation that ordinary end-users do not stand much of a chance against contemporary online threat actors.

There are so many attack vectors, so many software vulnerabilities, such well-resourced criminals with cleverly designed social engineering campaigns. The ordinary tech-unsavvy user is wide open to compromise, exploitation, blackmail, data and identity theft.

What I see happening is that they tend to seek out a lord of their technical domain. Someone to help and protect them. Someone to troubleshoot, clean up viruses and advise on technical matters.

Like so many professional and journeyman technologists, I find myself in this role. I am responsible for a host of computers, tablets and phones belonging to family, friends and neighbours. It goes beyond helping elderly parents with technical support. I host their websites on my server. I harden and maintain their computers and devices. I clean up the mess when they get nailed by the bad guys. They call me when they have a suspect link, or need help when stumped by a technical problem.

Of course I do all this completely for free. It is a pleasure to help friends and family in this way. I almost see it as a duty. In a sense I am a one-man digital militia, protecting and fighting back where law enforcement is completely absent. I have often wondered where this might end up. Maybe people will start to pool resources to defend themselves online: entire neighbourhoods with a pooled network, a firewall and a paid system administrator patrolling the virtual wall.

We’ll see. Maybe the wild west days of the internet are over. The bad guys have had such an advantage for so long that one imagines a corrective must be due. Until then, the vulnerable will huddle under the protection of the (relatively) strong, but as dozens of hacked celebrities embarrassingly discovered, no one is safe.

Innocence of Denmark

“Trust is a reducer of social complexity” by Pelle Sten (Flickr)

I continue to be astounded at how vulnerable Danes and Denmark are to cyber criminals.

Everyone needs Java installed to operate NemID, the authentication mechanism used by every government IT system and accepted by almost all large businesses too (e.g. banks). Admittedly NemID itself uses a decent two-factor authentication system, but it means an entire country has Java installed.

You need to give your social security (CPR) number to everyone. Video rental stores, dentists, language schools: they all need it. As a consequence there have been some mega breaches of the databases holding it. There is currently no way to change it either, so it is likely that bad actors have CPR numbers for most adult Danes.

Everyone is in the phone directory by default. People put their names on their doors and post boxes. It is an identity theft paradise, and Danish online retailers pay a heavy price for it.

All these systems are interconnected too. I was amazed when I got my passport picture taken at a local studio and they informed me that part of the price was automatic submission of the photos to the passport bureau.

Wonderful convenience. Just what IT should be. But it is wide open to exploitation.