Categories
Cool

The Encoding

“Last Prefect Qa’a, we believe a solution is at hand,” breathed Imuthes.

A sigh of either resignation or relief from Qa’a and then “We have time, but now would be a fine moment for a solution. For so long our solutions have been about survival, this last one,” a pause, “our rebirth.”

There were people still alive who remembered the light of Tabby falling on Hoth. Today, the Array transformed gamma rays from Sekhmet, a stellar-mass black hole, to bathe the planet in visible, yet strikingly different, light from what they remembered of Tabby. Qa’a was the one who demanded the addition of Last to his honorific. Many others had followed suit and added Last to their names. He mused it really wasn’t necessary, as living on Hoth at this moment meant you always knew you were the last. Last man of the family Narmer. Last of the Engineers of the Array. Last lover of Neith.

Imuthes floated alongside his beloved Prefect. Old beyond any body, the Great Engineer of the Encoding was mostly augmented except for his brain. That brain had taken on the challenges of the Last Save and the Recovery. One failed, and one, so far, had not. As they moved, Imuthes reflected on the loss of Tabby. Sekhmet had appeared seemingly out of nowhere. The rogue black hole was on the perfect trajectory to eject Tabby and capture Hoth. When they first discovered Sekhmet, it did not take long to determine that Tabby would be ejected. All of the people of Hoth were notified at the same time they were told of the plan for the Last Save.

The Last Save was the finale of a series of attempts to keep Hoth in orbit around Tabby. The energy needed was so great, Imuthes despaired many times. Finally, he came up with the Array. He looked up to see the rapidly fading star, knowing that in just a few years, the increasing distance would make Tabby invisible to the unaided eye. The purpose of the Array was to provide Hoth with the energy needed to maintain its orbit around Tabby as it was ejected from the local cluster. After years of construction and innumerable calculations, Imuthes realized that no array could keep Hoth with Tabby. The Array could, however, keep Hoth in a thousand-year decaying orbit around the despised Sekhmet.

If it were not for the Recovery, Hoth would be rubble slowly winding into the maw of Sekhmet. The Array was the great invention of his people. It saved the planet from annihilation. As the dark interloper Sekhmet drew near, Imuthes used the Array to provide the delta in Hoth’s velocity to match the evil black hole’s orbit requirements. With their orbit stabilized, the Array was repurposed to convert Sekhmet’s X-rays into all manner of energies for Hoth. Now, however, saving Hoth was out of the question. What they needed was a plan for the rebirth of their civilization. The Encoding.

The Prefect asked about the current rigor of testing for the solution. “We are in the 314th iteration of the simulation, my Lord Prefect,” stated Imuthes. “The last 100 iterations have used some new concepts I would like to discuss now. There is great care to ensure that the Encoding explores our question, can we be rescued from inside the black hole, while at the same time, at some point, understanding its purpose. The bias to ensure that upon understanding its purpose, the iteration still explores our question is what has stymied us until now.

“Just imagine, a society learns that all they are is a simulation instantiated by a dying people to rescue them from a black hole. Why would they continue their discovery, their science? What tool can we insert into the Encoding’s final iteration to bias the understanding fork to take the still true path?”


Mark was in trouble. His last paper had sent Professor Tegmax into one of his spittle rants. As a poor footballer in his youth, Mark had watched a small bit of spittle slowly freeze on his coach’s nose during a similar rant. Now, as Professor Tegmax emphasized the truism of science not being funny, fragments of spittle launched from the event horizon of his lips. Less a spray than an ejecta, the bits-o-spit mostly followed the same trajectory toward Mark with some wildly arcing off in the direction of the great window spittle attractor.

Mark was not listening. His mind wandered to the paper he wrote in college. The creative writing assignment was one of metamorphosis. Kafka had been consumed, and now it was Mark’s turn at writing a classic. The metamorphosis of food to feces was probably never anticipated by the professor, let alone Kafka, but the line including “individual feces pieces” was a hit in class.

“Mark, Mark, you are jeopardizing your credibility, not to mention your career,” moistly aired Professor Tegmax. “I was willing to support your research into mathematics as reality, but this ludicrous ‘language [the professor doing the silly air-quote finger thing] is Einstein’s observer’ is just, well, stupid. Occam would scream. Reality arises from language?”

“Sir, Kaufman, Strogatz, and Lereto’s work in adjacent realities was the key. They showed how true innovation was equal to personal discovery. The mathematics of a person discovering a new song was equivalent to Curie’s discovery of radioactivity. If, as appears increasingly likely, there is another external physical reality, then how does our reality present itself? How do we discover [Mark purposely doing the same air-quote finger thing] reality?”

“But Mark, language is what locks in reality? Please, sir.”

“Whatever is in my head, not communicated, is real only to me. As I hold on to some random thought, you and the rest of our reality will react as I either communicate or do not. When I elect to communicate, my words modify all adjacent realities. Some of those language constructs continue to propagate and become a new reality. We are individual observers who, as a collective, communicate the state of reality for our agreed-upon universe. We are the building blocks of reality, and language is what locks in our discovery of reality. At one time, the bulk of humanity considered the Sun a god, the world flat, and flight impossible. Once people are able to describe a new reality that permanently modifies enough adjacent realities, then the new reality is discovered. Language is reality. And don’t forget, we now know that language is based in math.”

“Until you come up with a mathematical model that explains this, I am loath to call it a theory, OK, theory, then your ideas are, to use the German: es ist nicht einmal falsch!”

Professor Tegmax’s use of German caused a huge increase in spittle. “That is why I am here, Professor Tegmax, I would like the university to fund my research. Pikovski just found pixelation. Our entire reality is encoded in the event horizon of some black hole. Why do we live on a black hole? Why is language the discovery tool for reality? You must support my research, Professor!”

“Get out, you and your crackpot ideas. Pixelation, indeed. Until the entire scientific community can understand and agree with Pikovski’s research, pixelation and the Holographic principle are not reality!”

“That Professor, I completely agree with,” Mark murmured as he left Professor Tegmax’s office for the last time.


“So, Imuthes, what is different with this iteration of the Encoding?”

“Prefect, we placed the simulation in our future, not our present or past. I did, though, add much of our history into the iteration. The encoding of the simulation is holographic and the learning method is our language. This simulation is intended to tell us what it is like living on an event horizon. Its goal is to ascertain if there is a way to leave the prison of the event horizon. Can our frozen reality leak away from Sekhmet? In the 314th iteration, the simulation has just discovered the holographic reality. Even though it has discovered that it is trapped on the outer edge of a black hole, it appears that it is continuing discovery. The simulation science continues!”

“What was your bias Imuthes? What did you do this time to keep the Encoding alive?”

“Qa’a, we encoded a bias for hope.”

“Ah, my dear Imuthes, you give hope back to me on this day. The array gives us light and saved our world from being torn apart. Yet, it will not save us from the same fate as our simulation. So be it, launch the simulation to the Lagrange point. Someday, may a great civilization discover our simulation still at work looking for or having found a reality for leaving the event horizon.”

Categories
Uncategorized

Social Gravity

Six degrees of separation, or on Facebook 3.57. Why are we attracted to Kevin Bacon? Why are there some people with a bunch of friends, while others of us have few? Why do social networks look like brain neurons or street maps or the cosmic microwave background radiation? Why does preferential attachment in scale free networks act like gravity? Gravity? What if? Let’s do a little back of the envelope exploration of the gravity in our social networks.

On Facebook, it now takes only 4.57 hops to get from you to any other person. Instead of Kevin Bacon, let’s call her That Girl. So, my connection to her would go from me, to you, to your friend, to their friend, and then to That Girl. You count me as a hop, but the degrees of separation is 3.57 (there are three people between me and That Girl). We will assume that the half person is accounted for by your friend, as he has an extra large beard, and his friend, as she is tall and plays volleyball. Me, to you, to Beard, to Stretch, to That Girl.

If you divided the hops into equidistant units, you would get 1.1425 units per hop. The type of unit doesn’t matter, so let’s just call them iluminaries. Next, we can bring gravity into our discussion by looking at masses and acceleration and all of that, or we can look at effects. At the micro scale we are talking about (human interactions), gravity is impossible to measure. But remember, gravity may get weaker, but gravity never quits. Gravity also knows no bounds. It has been working on everything, and I mean everything, for all but the first second of our 13.8-billion-year-old universe. Every piece of matter and every micro-joule of energy has at least one, but more likely trillions, of gravitational signatures embedded. So, let’s keep this simple and say in this case all gravity in our social network is of the same force. Let’s look at what happens to gravity’s effect as we take our hops to That Girl.

So what is the gravitational attraction between me and you, or any one hop? Well, gravity gets weaker by the inverse of the distance squared. Thus, for traveling just over one iluminari, we could say that our generic everything-is-equal gravity is now weaker than 0.766104, oh, uh, we need another unit, OK, let’s call them hugs (1/1.1425^2=0.766104). So on our first hop of 1.1425 iluminaries, gravity had a force of something or other (another new unit), but by the time I get to you, it is now 0.766104 hugs of its former force. Between me and That Girl, gravity is a shadow force at 0.047881 hugs of something or other. Let’s get rid of something or other by dividing everything by one something or other, and we end up with our hugs as the “force of gravity” at each hop. This is called normalization.
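The hug arithmetic above can be sketched in a few lines (a back-of-the-envelope check; “iluminaries” and “hugs” are this post’s own invented units):

```python
# Inverse-square falloff of gravity's "force" at each hop,
# with 1.1425 iluminaries per hop (4.57 total over 4 hops).
HOP = 4.57 / 4  # iluminaries per hop

def hugs(hops: int) -> float:
    """Normalized force of gravity after a given number of hops."""
    return 1 / (hops * HOP) ** 2

for n, stop in enumerate(["you", "Beard", "Stretch", "That Girl"], start=1):
    print(f"hop {n} ({stop}): {hugs(n):.6f} hugs")
# hop 1 (you): 0.766104 ... hop 4 (That Girl): 0.047881
```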

Let’s see what the loss of force does to our social network and how I am connected to That Girl. The average Facebook user has 338 friends. Some have more, some have less, but the person in the middle of Facebook has 338 friends. We could play around with the more friends, the more gravity, but that is another post. I’ll just place myself at the center of Facebook with all of my 338 friends. If just gravity accounted for how connections are winnowed down to That Girl, we would expect that at each hop, some number of connections would drop off and we would get to some non-zero result at the end, hopefully near one.

Let’s explore that last statement just a tad more. While in reality the path to get to That Girl will be Me, You, Beard, Stretch, That Girl, there could be more than one path. So my 338 connections have some number of common connections with your connections, and you have some number of common connections with Beard’s connections, and so on. In the end, however, many connections are ruled out as we hop our way to That Girl. What binds me to That Girl goes through each of our connection repositories (me, you, Beard, and Stretch). At our four hops, I still have 0.047881 hugs binding me to That Girl. So, let’s just use gravity to reduce the number of possibilities to get to That Girl.

If I have 338 friends when we start, the weakening of gravity will reduce the number of common connections with you to 259 (338*0.766104). At Beard, we are down to 50 (259*0.19153), and at Stretch we are at 4 (50*0.08512). When we get to That Girl, we are at the non-zero number of 0.2 (4*0.04788). Nifty. Our use of gravity seems to agree with the Kevin Bacon algorithm. However, it would be a stronger story if we ended up closer to one; after all, That Girl is one person. We fudged the 4.57 “distance” with Beard and Stretch, so the argument is stronger if we get closer to one.

Well, if this is about gravity, density matters. If you are a certain type of nerd you chuckled just then. Crazy thing, in network analysis today, we use a term called density to describe the total number of connections that could exist versus what actually does. Let’s say the planet Jupiter could have 100 moons, but the density of planets, moons, comets, dwarf planets, asteroids and other planetary material in our solar system limits it to 67 moons or a system density of .67. For Facebook, density is .12. Density is calculated by the actual number of connections divided by the total number of potential connections. Remember, Jupiter due to its size could have many more moons, it just hadn’t had the opportunity to capture those other 33. So, even though I only have 338 connections, I could have as many as 6834!

In this scenario, our 338 connections would spawn 6834 potential connections at you (338*((338-1)/2)*.12). From there, we have 4008 at Beard, 145 at Stretch and .81 at That Girl! So much closer to one.
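Under the same liberties the post takes — gravity’s falloff winnows the friend count at each hop, and the density formula turns that count into a potential pool — the chain can be sketched as:

```python
HOP = 4.57 / 4   # iluminaries per hop
DENSITY = 0.12   # Facebook's network density

def hug(hops: int) -> float:
    # inverse-square falloff after a given number of hops
    return 1 / (hops * HOP) ** 2

def potential(n: float) -> float:
    # density-limited pool of potential connections from n direct ones
    return n * (n - 1) / 2 * DENSITY

n = 338.0  # my friends
for k, stop in enumerate(["you", "Beard", "Stretch", "That Girl"], start=1):
    print(f"{stop}: ~{potential(n):.2f} potential connections")
    n *= hug(k)  # gravity winnows the count before the next hop
# lands within rounding of the post's 6834, 4008, 145, and 0.81
```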

We have a distance of 4.57 iluminaries. My gravitational attraction has captured 338 friends into my orbit, but because of the mass of my friends coupled with the potential density of friends around me, I am pretty assured of being connected to anyone on Facebook who comes within 4.57 iluminaries of me. Once I get to 375 friends, I am a lock to capture anyone on Facebook into my gravity well.

Why did the numbers come out so close using gravity parameters? Actually, I was somewhat surprised. Remember, I also took a bunch of liberties. I used how the force of gravity decreases as a stand-in for the actual force of gravity. I also used network density to the advantage of my argument, yet this is all pretty well within the bounds of reasonable. In both of our cases, we had non-zero, non-negative numbers less than one. If you build a spreadsheet with these numbers, you can see how having just 375 friends gets you to one. Big data is giving us the tools to look at patterns. I argue many of our patterns are the result of gravity’s universal reach that never rests.

Categories
Big Data Business Management Business Techniques Cool Data Enterprise Collaboration Innovative Technology Virtualized Consulting

The Democratization of Work – A Cooperative Exchange for Work

For a generation, the American worker has been under siege. Allison Pugh, Sociology Professor at the University of Virginia, writes about how today’s employment is a “one-way honor system” that ties workers to employers who have little responsibility to employees. Yet, from globalization’s competitive pressures stymying wages, to business’s relentless pursuit of productivity gains, to technology’s constant disruptive influences, the American worker has weathered it all. However, at this moment in time, the American worker stands at a fork with two clear choices for the way forward. Stay the path and let corporate lead, or band together in a new unionism to democratize work. I suggest the way forward is a cooperative work exchange, powered by block chain, defined and owned by the American worker.

With unions no longer looked to for protection, business pensions an anachronism where they still exist, and technology rapidly changing the work relationship from the top down, the American worker must realize that the status quo is both undesirable and unsustainable. The gig economy as it is currently being implemented further erodes the American worker’s pay, rights, and status. Yet, the gig economy could promise something so much more powerful. The gig economy could usher in new freedom and power to the American worker, while at the same time giving business one of the greatest advances in productivity since the advent of the assembly line. The gig economy instantiated as an exchange for work may some day rival the power of the world’s great exchanges for equity.

This exchange for the offering, contracting, managing and paying of work is envisioned to be powered by block chain technology (something like Ethereum). While work will always be identified at the point of need, the terms and conditions for the conduct of that work will now flow from the bottom up (gig worker to employer). To incentivize corporations in this new paradigm, businesses will have the ability to access the entire block chain ledger for analysis and Google-style algorithms for work forecasting and modeling. The Exchange’s mission will be to serve the American worker while enhancing the businesses that employ her by leveraging the power of the crowd. Here’s how it could work.

Technology

Recently, the Ethereum Project forked the technology underlying Bitcoin to create a generalized platform for building decentralized applications. Businesses and entrepreneurs quickly took note, with IBM even outlining an Internet of Things management concept using Ethereum. One of the Ethereum Project’s major enhancements was the inclusion of what they call contracts, which is basically a framework for including software logic with each transaction entity.

Thus, in the work exchange, the basic transaction entity would be a piece of discrete work. Discrete as in a task that is part of a user story, waterfall schedule, or even a real-time help desk ticket. The work would have metadata outlining the skills needed, timelines, remuneration, and taxonomic elements (epic, program, department, company, etc.). Identity would be encrypted until necessary for consummating a transaction. This would protect businesses from corporate espionage and workers’ true identities from the negative side of metadata analytics. Everything else would be transparent to all on the work block chain. There are arguments for certain metadata to also be encrypted, and maybe even some of the software logic as well, but these are topics that the exchange would decide according to its governance charter.
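As a purely hypothetical sketch, such a work-item entity might be modeled like this (all field names are illustrative assumptions, not drawn from Ethereum or any real exchange schema):

```python
from dataclasses import dataclass

@dataclass
class WorkItem:
    title: str            # the discrete task itself
    skills: list          # skills needed to perform it
    deadline: str         # timeline (ISO date)
    remuneration: float   # offered payment
    taxonomy: dict        # epic, program, department, company, ...
    poster_id_enc: bytes  # identity stays encrypted until settlement

# example: a real-time help desk ticket offered on the exchange
ticket = WorkItem(
    title="Triage priority-1 help desk tickets",
    skills=["ITIL", "ServiceNow"],
    deadline="2016-09-30",
    remuneration=120.0,
    taxonomy={"department": "IT Ops", "epic": "Service Desk"},
    poster_id_enc=b"\x00" * 32,  # stand-in for an encrypted identity blob
)
```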

Whereas mining is the main trust mechanism in Bitcoin, algorithm and software contributions, research studies, and other exchange-enhancing activities will be the sources of trust and user juice in the Exchange. It’s one thing to be great at a certain type of work or need a ton of work from the Exchange; it’s another to enhance work for all.

I’m always amazed at how a simple idea rendered in software can become worth $1B plus. Uber, the harbinger of the gig economy that sometimes bears its name, is GPS, a map, a few algorithms, and a software form. With the Exchange, a similarly simple worker interface is envisioned – a task or to-do list. There is certainly a need for a vitae, yet all the user really needs is a list of their current tasks, a search capability to find new work (gigs) to bid on, and the mechanism to do so.

On the business side, current project management software, using REST/JSON integrations with the Exchange, would pretty much function as it does now for work under its purview. However, the massive data store encapsulated in the Ethereum block chain ledger offers a litany of opportunities for business performance improvement.

Contracts

Leveraging the contract mechanism in Ethereum to bundle user defined work contracts is one of the key concepts of this proposal. While Uber gives drivers massively improved efficiency, it also places them in this new and mostly unregulated world of the gig worker. It is clear that business is moving to a post employee era, where all work is contracted on demand. However, the current gig economy has no mechanism or support for contracts that shield the worker. As essentially a company of one, all of the employment laws cease to cover this new business entity. Further, the bulk of case law is premised on the company-employee or company-company relationships, not this new hybrid, company-gig worker relationship.

So, new rules. Why can’t the exchange limit contracts to those defined by the gig workers? As contracts gain favor by workers, and more workers join the Exchange, the contracts become powerful actors in the company-gig worker relationship. For example, work for hire, favored by companies, is not in the best interest of the creative knowledge worker. In the software engineering world, open source (the opposite of work for hire) is rapidly becoming the preferred contract arrangement. Gig worker defined contracts, with enough workers that require them, can accelerate this evolution.

Most contract language today is a series of clauses that have a few variables that lawyers love to re-code. Regardless of the encoding, the clauses end up with the same few variables. Further, certain clauses cannot have certain variables in the presence of other certain clauses. Some clauses even respond to changes in state! Sounds like something a computer is good at doing. Luckily, the legal system is finally undergoing a computing transformation. For example, Stanford University’s law school has a large legal Informatics program that includes work into computable contracts.
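A toy sketch of that idea — clauses as small state machines with a few variables, plus a mutual-consistency check — might look like this (purely illustrative; not drawn from Stanford’s program or any real computable-contracts framework):

```python
class Clause:
    """A contract clause: a few variables plus a bit of state."""

    def __init__(self, name: str, **variables):
        self.name = name
        self.variables = variables  # the "few variables" lawyers love to re-code
        self.state = "draft"

    def on_event(self, event: str):
        # some clauses respond to changes in state
        if event == "signed":
            self.state = "active"
        elif event == "breach":
            self.state = "disputed"

def consistent(clauses) -> bool:
    """Certain clauses cannot coexist, e.g. work-for-hire vs. open source."""
    names = {c.name for c in clauses}
    return not {"work_for_hire", "open_source_grant"} <= names
```

A contract bundle would then be validated with `consistent(...)` before the Exchange accepts it.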

One of the reasons the Exchange is proposed to be a non-profit entity owned by the gig workers is in direct recognition of business motivation. What globalization has taught business is that the employee as a resource is not a euphemism, it is reality. Resources in the warehouse is synonymous with employees on the bench. Training is a resource cost. Turnover happens and it is a cost. Friction is to be avoided. These are not behaviors to be saddened by, they are the behaviors that power our modern world and are unstoppable. We should not mourn that business is not motivated by what is good for the worker, we should recognize such, and put in place mechanisms that do consider the worker.

This new form of collective bargaining should appeal to all sides of the political spectrum. It does not need new law making, as it leverages what is already in place. It is not anti-business, as it recognizes what motivates business and agrees. This new form of collective bargaining actually gives business unprecedented forecasting capabilities. This new form of collective bargaining is actually about freedom for the individual via the power of democracy.

Analytics

Since the Kevin Bacon game popularized what is now known as scale-free networks, graph theory has taken center stage in math as an economic value driver. The science has rapidly gone from predictions of nodes joining a network to actually positing gravity-driven information theory. It is hard to argue with the power of network analysis. Google is one of the biggest and most successful companies ever to trade an equity, and they owe it all to the power of network analysis via graph math. Web links describe networks. Search terms by person or by concept describe networks. Each of these networks has metadata that further describes it.

If you have millions, and soon billions of work items traded on a 100% transparent exchange, with a plethora of metadata defining taxonomies of every kind imaginable, what kind of networks and their attendant insights might one expect?

At some point, common work terms by taxonomy will become apparent. I call them lexemes, but you can think of them as types of work. With enough samples, we should start to see common sequencing of these work items. Over time, we will certainly get a good handle on effort estimates. Using graph math, we can create new algorithms. These algorithms can start to make project performance better. They can forecast, they can predict.

Thus, the Work Exchange is not only for the worker, it is for business. A massive data store of work can enable us to take on the great projects that stand as dreams today. The massive data store of work can provide detailed plans of unparalleled fidelity. The massive data store of work can forecast risks and trends with high confidence. The massive data store of work will surprise us with insights we cannot see today. Business will finally see through the fog of tomorrow and strategy will become truly executable.

Exchange Charter

While not an exchange, yet, LinkedIn at a $25B market cap shows how valuable a work exchange could become. However, if LinkedIn is the exchange, the exchange would be beholden to investors, and then, in the end, the gig workers become the product, so rinse, repeat, and workers are resources again in short order.

LinkedIn, Google, Facebook, et al. all have the same model. The creators and their data are the product, and advertisers are the customers. If the Exchange is a traditional business, who is the customer and who is the product?

It makes sense to create an entity that is chartered by the gig worker members, charges a micro fee for each transaction, and then gives some portion of profit back to the workers. The product is the work exchanged, as well as its metadata. Businesses are the customers. Making the gig workers the owners changes their motivation as well. The gig workers will want the massive data store of work to always increase in value. The gig worker owners will want the best contracts for their members that can actually solicit work from business! The gig workers can only change the contract paradigm as far as business can comfortably adjust.

As is readily apparent in understanding this exchange for work, the size of the crowd is pretty key. Both power of negotiation and power of analytics are directly proportional to the numbers of workers and businesses participating. That is one reason that I think that some of the old tactics of unions will not find their way to success in the Work Exchange. A strike is bad for both sides, as the value of the exchange is driven by the number of successfully contracted work items. Work slowdowns, same thing. Outlandish contract claims as a method of negotiating might be tried, but the more logical members would point out the weakening effect of prolonged negotiating. While these tactics may occur, they will not be effective in this new model, as the workers themselves own what is most valuable. In old unionism, the power stemmed from the collective ability to stop working, hurting the employer. In the new unionism, workers will coerce business into friendlier contract terms by providing increasing value, not value-denial actions.

Whatever the charter, the Exchange must be ruled by its members and not investors.

Conclusion

Business has already started the move to a gig economy, shedding the costs of the employee with the reduced costs of either a vendor or a temporary worker. It is clear that the full time employee is on the road to extinction. So what is the American worker to do? Stand pat, or begin a dialog for a way forward? We think a summit to discuss this Work Exchange concept, even if it does not result in a decision to move forward, will begin to illuminate a positive future for the American worker in the gig economy.

Categories
Benchmarking Big Data Business Forecasting Data

The Semantic Approach to Building Accurate Project Plans

An idea.  Alone and unshared, while of great potential, it is worthless in the wild.  In the consumer space, ideas are tweeted, trending, memes, most popular, and can even go viral.  The 140 characters of Twitter and the brief updates on Facebook, singular or compound ideas, are being used to target marketing.  In the consumer space, ideas have value.  It costs over $54 (Google AdWords) to tap into the buy insurance idea.  So what about in the business enterprise?

For over 60 years, since the advent of modern project management, ideas happen once, and then the work begins.  No trending, most popular, nothing like a business segment meme, or even the winning idea going viral.  We keep developing a project plan like it is a one-off exercise with no recognition that the past shapes a new project’s prospects for success.  The project – one-off things we do very poorly (less than 50% full success rates).  Sounding more and more like the definition of crazy.

Last year, I completed an in-depth study of ideas in the workplace.  I consciously decided not to use the idea word, as it sounds so, well, fluffy.  I use lexeme (a basic unit of meaning), a word borrowed from linguistics.  I also use lexemetry to describe the process, measuring lexemes in a context.  For me, that context is projects.

Here’s what I did.  First, I had already developed an engine that does semantic matching.  While it can look at any string, I focused on task titles for this research.  For each task, the matching engine looks for other task titles that semantically match the new task title.  New matches are added to a matchset cache.  Now, we have a subset of all tasks that match semantically.
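The matching engine itself isn’t described here, so below is a toy stand-in that uses token-overlap (Jaccard) similarity in place of real semantic matching, just to illustrate the matchset-cache flow:

```python
from collections import defaultdict

def similarity(a: str, b: str) -> float:
    """Token-overlap (Jaccard) similarity between two task titles."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

def build_matchsets(titles, threshold=0.5):
    """Group task titles into matchsets keyed by the first title seen."""
    matchsets = defaultdict(list)  # the matchset cache
    for title in titles:
        for key in list(matchsets):
            if similarity(title, key) >= threshold:
                matchsets[key].append(title)  # semantic match found
                break
        else:
            matchsets[title].append(title)  # start a new matchset
    return dict(matchsets)

sets = build_matchsets([
    "Develop Project Charter",
    "Develop the Project Charter",
    "Deploy to Production",
])
# → two matchsets; the two charter titles land together
```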

The premise is that the tasks that match, those that happen more than once in a project portfolio, form the cogs of what makes the day-to-day project assembly line operate.  These matchsets are the iterating steps of a formal process.  They are multiple copies of project templates.  The matchsets are also things that everyone adds to projects based on the culture of the enterprise.  They are the crowd-sourced knowledge of the way to get projects done.  Also, iterating tasks allow us to build models based on the matchsets – cost, roles, durations, efforts, movements, relationships, all the metadata of project work.  In fact, I’ve argued that the bulk of our enterprise big data is meta of work/performance.  Thus, these matchsets are actually the Higgs bosons of enterprise big data – everything else is created from work.  As such, the matchsets can also provide insights into those tasks that don’t match, the value givers.  With models we have benchmarks.  With models we have ranges.  With models, we can even evaluate worst, bad, abnormal, normal, good and best.

While we could look at any other meta of the matchsets, I wanted to do something with time.  Yes, flexing our analytical muscles.  I wanted to show that these matchsets are not just static ideas/lexemes glued to the projects where they occurred.  They move, they materialize in different parts of a schedule.

However, time comparisons are not a simple endeavor.  Projects have different durations, and June 15th is comparable to what?  We need to normalize time across projects.  We chose percentage as our canonical form for time.  Thus, the beginning of the project becomes the 0th percentile of time, while the 100th percentile becomes the end of the project.  Now, all time is directly comparable, and all of our matchsets live in this normalized time.
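The normalization itself is a one-liner; a minimal sketch:

```python
from datetime import date

def time_percentile(task: date, start: date, finish: date) -> float:
    """Map a task date onto its project's 0-100 percentile timeline."""
    return 100 * (task - start).days / (finish - start).days

# June 15th in a calendar-year project sits at roughly the 45th percentile
p = time_percentile(date(2015, 6, 15), date(2015, 1, 1), date(2015, 12, 31))
```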

We have some cogs of commerce iterating in time, so what can we tell from this?  We can find benchmarks, mine process, and even identify best practices.  But first, our data.  I randomly selected data from a few databases. I also ensured that I did not get all similar businesses or industry types.  That resulted in just shy of one million tasks in over 20,000 projects.  I found:

  1. At least 40% of tasks are iterative.  We have a benchmark.  I’d also say that the notion that all projects are one-offs is dead.  Two out of five lexemes in your project are iterative.  Plan on it.  Use it to your advantage.  Understand your iterating ideas via measurement.
  2. On average, lexemes can deviate in time by 11.52%.  We have a benchmark for a task buffer.  The critical chain folks are going nuts!  To me, this is so fundamental.  We have a statistically significant, empirical number that shows how difficult it is to schedule and perform work.  Any piece of work can move by around 10% of the project timeline!  Conversely, we can use this bedrock datum to plan better.  Agile, your stories now have a time scale.
  3. Look at this composite process I mined out:
Matchset Master Title          Start Percentile
Develop Project Charter                      4
Project Initiation Activities                5
Develop Preliminary Plan                     6
Update Charter                              10
Complete Charter                            12
Provide Detailed Plan                       13
Develop Communications Plan                 23
Identify Supply Chain                       27
Develop Requirements                        30
Execution Phase Start                       42
Development Complete                        43
Deployment                                  69
Execution Complete                          70
Deploy to Production                        86
Training                                    86

At a very large company, I found the commonalities in the plethora of templates across many business units.  Additionally, I found over 80 lexeme matchsets (in proper time sequence and proper time distance) that were being consistently added to these project plan templates: sub-project assembly lines.  Lastly, their very traditional waterfall approach meant that actual project work consistently did not start until almost halfway into the project!
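
The statistics behind these benchmarks can be approximated by grouping matched task titles and measuring the spread of their normalized start times (a hedged sketch; the data shape and grouping key are assumptions):

```python
from collections import defaultdict
from statistics import mean, pstdev

# (matchset title, start percentile) pairs mined from many project schedules
tasks = [
    ("Develop Project Charter", 3.0), ("Develop Project Charter", 5.0),
    ("Develop Project Charter", 4.0), ("Training", 80.0), ("Training", 92.0),
]

groups = defaultdict(list)
for title, pct in tasks:
    groups[title].append(pct)

# A matchset's canonical position is its mean start percentile; the standard
# deviation shows how far that lexeme "moves" across projects.
summary = {t: (mean(p), pstdev(p)) for t, p in groups.items() if len(p) > 1}
# summary["Training"] -> (86.0, 6.0)
```

Averaging the per-matchset deviations over a whole portfolio is one way to arrive at a figure like the 11.52% above.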

I believe that by applying the benchmark iteration number (40%) and the benchmark time deviation number (11.52%), any enterprise can increase project success.  It is time the enterprise got its meme on, as lexemetry is poised to go viral.

Categories
Business Optimization Project Management

Project Managers, Our Next Action Heroes

In the movie 2012, intense solar activity triggers a global cataclysm, and mankind survives aboard hastily built arks.  Project management certainly played a key role in building those arks.  Here in the real world, we actually do face a serious threat from solar storms, and project management is our best bet to overcome the devastating effects of the next big storm.

A very active region on the sun, AR2192, is currently Earth-facing and throwing off X-class solar flares.  So far, none of these has produced a coronal mass ejection (CME).  Maybe tomorrow, but certainly someday, the Solar Influences Data Analysis Center (SIDC) at the Royal Observatory of Belgium will issue an alert that will call on us to be project management optimum.

Forty-two hours after the alert, the time it would take for the charged particles to reach Earth, our technological society will regress to 1859, the last time an event of this magnitude transpired.  In 1859, a series of CMEs known as the Carrington Event fried the fledgling telegraph system.  The 1859 Internet.  Auroras were seen as far south as Cuba!

Forty-two hours after the SIDC alert, we will find out how good we are at project management.  We will find that, at best, well over half of everything we do will either fail completely, take much longer than we anticipated, or cost more, taking away resources from other critical priorities.  Based on forensics of past CMEs and the rate of CMEs observed by satellites over the last 30 years, a Carrington-level event has a 12% likelihood of occurring in any ten-year period.  Compounding those odds, a Carrington-level event is roughly two-in-three likely before the end of this century, and better than 90% likely by 2200.

I’m not crying wolf: we just missed a Carrington-level event by one week on July 23rd, 2012.

Forty-two hours of notice.  With all of the wonderful opportunities we as a species have in front of us, along with the great challenges that we face, project management is becoming the common enabler.  Of anything mankind can do, making projects predictable removes the greatest amount of risk.  I call it project management optimum.  While we will be devastated by a Carrington-level CME event, we can recover faster, saving countless lives, if we are project management optimum.  The only way for SpaceX to get to Mars, within any reasonable expectation of funding, is to be project management optimum.  Of note, a CME direct hit is just the most clear and present danger.  Any engaged person could come up with several grave dangers and wondrous opportunities that project management optimum could address.

We will have seen our demise coming for forty-two long hours.  The US General Services Administration (GSA) recently conducted a challenge to describe what project management would look like in 2039.  What is needed is a bigger challenge, an actionable challenge: an XPRIZE-like competition to consistently forecast cost, schedule, and risk, the pillars of project management, at the 95th percentile of accuracy or above.  We have the data to do this today; we just do not have the algorithms.  The research is out there, just not system-engineered into a solution set.  Google “$100 million investment,” and you will see nations, states, and even cities and companies investing funds of this size.  For that amount, GSA could set up the investments and prizes to get us to the 95th percentile.

A two-in-three chance in the next 86 years and only 42 hours’ notice.  It’s time to make project management optimum a national priority.
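
Decadal odds compound over longer horizons.  Assuming each ten-year period is an independent trial, the 12%-per-decade figure works out as follows (a back-of-the-envelope sketch):

```python
def cumulative_odds(per_decade: float, decades: float) -> float:
    """Probability of at least one event across the given number of decades."""
    return 1.0 - (1.0 - per_decade) ** decades

by_2100 = cumulative_odds(0.12, 8.6)   # 86 years out: roughly 2 in 3
by_2200 = cumulative_odds(0.12, 18.6)  # better than 90%
```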

Categories
Government IT

A Vision for a New Era of Project Management – Bootstrapping a Solar System Civilization

The White House Office of Science and Technology Policy asked for entrepreneurs and visionaries to submit their ideas for bootstrapping a solar system civilization.  OK, here you go.

The Inefficiency Tax – Huge Friction At Any Mass

Regardless of how we attempt our immigration to other worlds, bootstrap or pay today’s estimated full price, an inefficiency tax will be levied.  In a 2012 study, McKinsey & Company found “On average, large IT projects run 45 percent over budget and 7 percent over time, while delivering 56 percent less value than predicted.”  It is very likely that the average inefficiency tax is more like 50 percent, as IT projects are much more mature than most corporate projects.

However, we do not have to pay this tax.  We can overcome this tax.  We can enjoy refunds from an ever increasing efficiency.

The greatest payoff, one that will impact every facet of the civilization of space, is to become project management optimum.  Projects are the delivery mechanism, the new assembly line, for anything of real value.  In fact, Tom Peters has said, “All work of economic value is project work.”  Yet, due to our inefficiency in planning and executing project work, everything we do costs more, takes longer, and delivers less than we planned.  If we have any hope of becoming solar system migratory, we must become project management optimum.  Project management optimum delivers at six sigma: on time, within ±5 percent of cost, and with more scope/stories than originally planned.  Project management optimum will stop the levying of an inefficiency tax.

Many trends are moving us toward project management optimum.  Overall, project management maturity has increased, with the importance of the project now seen in all areas of the corporate world.  Marketing gets that it too is project driven.  The CFO has become a believer.  GSA even conducted a vision for the future on Challenge.gov for public sector program management.  However, the most impactful trend is the over-hyped yet powerful reality of big data.  Big data: the variable resolver.

That’s what projects do: they resolve variables.  At their best, projects anticipate variables.  At their optimum, projects identify and help communicate the choices of the variables while recommending a best course of action.  As projects become more complex, variables increase exponentially.  While we humans are great at spotting patterns and trends, project management artifacts do not lend themselves to easy pattern or trend recognition.  We need to rethink the summation of project artifacts in project management optimum.

Thus, the call for mining lexemes in my Challenge.gov winning paper.  A lexeme is a basic lexical unit of a lexicon, consisting of one word or several words, considered as an abstract unit and applied to a family of words related by form or meaning.  Semantic similarity.

Amazon buying recommendations, Google search, Facebook friend do-you-knows and NSA phone call metadata analysis have demonstrated that graph theory and small world algorithms are very accurate forecasters.  To get to project management optimum, we must build context specific graphs of lexemes at work in projects.  We must build these models pan-project, pan-program, and pan-nation/language.  We must define protocols for sharing lexemes and their metadata.  We must discover new insights into language and the short term evolution of language to identify trending lexemes that are nearing meme-birth.  Memes, when viral, are the project’s best friend, yet many times, its worst enemy.  Best to know which.
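
One context-specific graph of lexemes can be sketched as a weighted co-occurrence structure: lexemes found in the same project become linked, and edge weights accumulate across the portfolio (the data shape and names here are illustrative assumptions):

```python
from collections import Counter
from itertools import combinations

# Each project is a bag of lexemes mined from its artifacts.
projects = [
    {"charter", "requirements", "deployment"},
    {"charter", "requirements", "training"},
    {"requirements", "deployment", "training"},
]

# Undirected weighted edges: how often two lexemes co-occur in a project.
edges = Counter()
for lexemes in projects:
    for a, b in combinations(sorted(lexemes), 2):
        edges[(a, b)] += 1

# The heaviest edges are the strongest small-world ties in the portfolio.
strongest = edges.most_common(3)
```

Small-world algorithms then operate on graphs like this to forecast which lexemes travel together.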

I believe like Carl Sagan that, “Exploration is in our nature.  We began as wanderers, and we are wanderers still.  We have lingered long enough on the shores of the cosmic ocean.  We are ready at last to set sail for the stars.”  It’s a long trip, though.  Let’s start our journey with project management optimum.  Of all the technologies that we will have to enhance or invent, becoming project management optimum will be by far the cheapest.  However, it may well have the most impact.

Categories
Program Management

Program Management Challenge Winning Submission

GSA via Challenge.gov asked the public for a vision of public sector program management in 25 years.  Well, what do you know, my entry was recognized as the Best Overall in the competition, judged by leaders in the public sector industry.

The Lexeme Way to Mars

In 2020, SpaceX announced it would send a team to Mars.  In his speech, the SpaceX CEO declared that aerospace technology would not be the deciding factor of success.  He said, “It is the logistics of this endeavor.  Knowing with a high degree of confidence what risks to worry about, and then spending only the minimum amount of money to address them.  Projecting out with great accuracy who would be needed where, when, and if necessary, how many.  Sequencing as efficiently as possible the plethora of tasks to accomplish.  We believe in the lexeme approach, and will rely on a measured lexicon for success.”

His soaring rhetoric capped an amazing few years of discovery.  It was not a smooth road all the way to that podium.

It started with big data.  So much hype, so little result.  The Internet of Things ended up being the Internet of Overload.  The rate of change was so great, yet the rate of taming enterprise performance kept lagging.  Economic growth stagnated as productivity improvement fell to zero.  Project failure rates stayed so high that journeying to Mars began to seem nothing more than a dream.  All of the great challenges mankind faced morphed from righteous endeavors of assured victory into what seemed epic struggles to indeterminate ends.

And then, as always, some simple ideas emerged to light the way forward.  Of course it had to be ideas, as over 80% of the value of American business was then based on the intangible.  Today, in 2039, it is estimated that 95% of all economic value is based on intangibles.  Thus, at the dawn of the 21st century, business realized that the project was the new assembly line.  Projects could move ideas to market.  Projects could enhance the value of tangible goods.  Projects could herd disjointed ideas into something of value.

Software and algorithms were produced in great numbers to address the project as the value driver of the economy.  In 2014, there were over 200 project-related software products.  However, improvements were small.  Failure rates remained too high.  But then, using the assembly line metaphor, researchers suggested a quest to find the new cogs driving processes and projects.  If the cogs could be found, and then measured, performance nearing the manufacturing holy grail of six sigma could be attained.  A visit to Mars might seem real again.  Victory over epic challenges could be a forecast.

The key idea was the realization that lexemes, basic units of meaning, could become the cogs of commerce.  With algorithms similar to Amazon buying recommendations and Google search, lexemes could be semantically matched as they were digitally produced in project plans, workflows, action items, to-dos, risk registers, and anywhere else digital project artifacts existed.  Once the software folks had collections of like things, they could begin to measure them.  Now the promise of big data and the Internet of Things could be realized.  If you had the cogs of performance, everything else was metadata: measures!
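
A crude stand-in for that semantic matching can be built from surface-string similarity (difflib here is a deliberate simplification; a production matcher would use embeddings or a lexical database such as WordNet):

```python
from difflib import SequenceMatcher

def similar(a: str, b: str, threshold: float = 0.8) -> bool:
    """Treat two task titles as the same lexeme if their strings nearly match."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

titles = ["Develop Project Charter", "Develop the Project Charter", "Deploy to Production"]

# Greedy single-pass grouping of titles into matchsets
matchsets: list[list[str]] = []
for t in titles:
    for group in matchsets:
        if similar(t, group[0]):
            group.append(t)
            break
    else:
        matchsets.append([t])
```

Here the two charter titles collapse into one matchset while the deployment title stands alone; once grouped, everything attached to those tasks becomes measurable metadata.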

In 2016, GSA sponsored an XPRIZE-styled competition for the first company to produce the most accurate project schedule using big data.  Judging used a completed project schedule as the rubric; the winner had not only to identify what work had to be accomplished but also to sequence and size that work.  The results were astounding.  Lexemes could be mined.  Lexemes could be measured.  Lexemes could be algorithmically manipulated.

At the moment of SpaceX’s proclamation supporting lexeme-based management, the modern way of managing projects was born.  In 2022, the first tool to mine lexemes from requirements statements was developed, successfully solving one of the touchstone problems of project management: scope management.  Linking lexemes from requirements statements to work products to schedule items to test cases was accomplished by software.  Requirements traceability was solved.  It did not matter if the organization iterated, waterfalled, or even furballed; the lexemes, their performance trails, and their measures provided clarity and projections.

The merger of Twitter and IBM in 2023 heralded the next great capability.  Lexeme trending and Watson learning allowed people to leave their desks and venture…wherever.  A message to their Watson instance, and a project manager had 3D bubble charts and the status of lexemes on the critical path sent to their iPhone 15.  Later, in 2024, Google opened access to its full map universe, allowing lexeme location at the office level.  Where were people working on ideas that were in trouble?  If presence was required, measured lexemes allowed identification of the nearby person with the most experience of that particular lexeme.  That same year, LinkedIn added a section, and integration points with software tools, to provide a professional’s lexeme statistics in the skills area of user profiles.

In 2030, Princeton University finally finished updating the WordNet Lexical Database into an unabridged lexeme database with measures.  In 2032, Princeton finished all language translations of WordNet.  SpaceX was a major contributor to the WordNet endeavor.  At the press event, the SpaceX CEO declared, “Today, my supply chain is finished.  Next stop, Mars.”  When asked by a reporter what he meant by supply chain, the CEO stated, “Today, I can see how ideas are moving through the Mars program.  Ideas are my raw materials.  The measuring of ideas via taxonomy makes my products.  The statistical sequencing of ideas is how I get to market.  With the new WordNet, language, the final friction in my idea supply chain, is now gone.  How lexemes are used in a particular endeavor or program is the only language that matters.”

So much has been written about the Great Save of 2035 that the only point to be made in this paper is how the Jet Propulsion Laboratory’s Watson instance uncovered the lexeme trend that led to the discovery of the ballistic recovery system flaw.  A risk management algorithm identified several converging lexeme trends within the SpaceX Mars Program and, incredibly, in general aviation lexemes.  The key insight was how outside lexeme movements had such powerful effects on what was considered a mostly isolated system.

So, here we are today, July 20, 2039, the day of mankind’s first Mars landing.  As Commander Sophia Mandrier readies her crew for their first steps on Mars, the Asteroid Deflection Program is working with SpaceX to re-route the Dragon 11 capsule to the rendezvous Lagrange point.  Two of the largest programs ever attempted by mankind, one for our survival and one for our dreams, are being conducted concurrently.  Both programs are reporting lexeme defects at seven sigma.  Success is the high-confidence forecast.