Brainstorming is for solutions, not for ideas.


What headline should I use? A story about a camel?

Here is why: people don’t gather around the campfire, buckle up for smarty-pants time and start throwing ideas into the air! People tell stories around the campfire. Likewise, families don’t sit at dinner, put on the genius hat and emit weasel ideas at each other; families share their lives.

In general, gatherings are where we humans put our experience out, mix it with what the others put out, and then absorb it back in as a better, fuller and more optimised interpretation of life.

That is the reason why brainstorming for ideas is a horror that must be ended.

Brainstorming, everyone’s favorite pastime or worst curse, is a good practice only if employed at the right time. Before brainstorming, there are prior steps to take.

First, you settle on the idea through ideation. Second, you reach agreement that it’s a good idea through analysis. Only third do you brainstorm, and you do it to find solutions.

Brainstormed solutions are basically a plan for how to make the idea a reality.

I hate committees, and you should too


Ideation is the process of selecting and fine tuning ideas, not the process of finding ideas.

The idea of searching for ideas is stupid. Yes, there are stupid ideas, plenty of them.

If a person has no ideas, no process or activity will help; it is a personality trait.

Many introverts falsely believe they have no ideas because they lack the ease of saying them out loud. That is why, before any ideation process, every participant must make their own set of ideas, filtered by their own values and knowledge. Then they present their sets, in turns, uninterrupted, so that introverts get their chance to express themselves.

Then the group waits for incubation.

This is the biggest missed step everywhere, incubation.

Without incubation, no creativity, only ingenuity, which is a very different beast. After incubation, you can start ideation, the process of picking the right idea and forming it to become more general than the original, a.k.a. fine tuning the idea.

Spotting ideas is a personal activity.

I don’t think we can search for ideas, and therefore ideas cannot be found.

Ideas surface out of personal interpretation, and we spot them or not. Sure, we all like to bounce concepts, stories and leads off one another, but after we bounce them in small talk, or interesting water-cooler chat, or, say, after-meeting impressions, or sharing personal events, we melt what we got back from the bounce into our own personal and unique interpretation.

ideas are the patchwork of originality inside your model of the world

Ideas then surface as part of our view, our personal take on the subject matter. The skill is to spot ideas: the patchwork of originality inside your model of the world, regarding the subject at hand. It can be trained like any other skill, but that also means idea spotting can be a talent. In fact, the truly skilled or talented spot ideas in other people’s minds.

“Without incubation, no creativity, only ingenuity” means that with ingenuity you pull out undone concepts from your incomplete model of the subject.

Ingenuity is a product of the randomization process which any brain does to find possible predictions. Ingenious people are prone to produce volumes of ideas and, failing to see the haste, we waste money, time and energy on possible stuff. But anything is possible.

Ideas are when the brain finds probable stuff.

Probability appears when possibility is immersed in inference and deduction.

Incubation is needed because, while deduction can happen on the spot, inference is based on the time consuming retrieval process inside our brains. Retrieval of memory is complex and usually requires the activation of weak neural systems, which in turn take time to build enough excitation.

Intuition is when ideas are correct, meaning, when we use exclusively true facts and data in the model we build on the subject and stick to that to come up with original bits.

Time is of the essence for good ideas.

Look closely and you’ll see how the biggest business people gave themselves the time they needed to incubate. Sometimes that incubation time is their actual career, sometimes it is their hobbies, sometimes it is their workaholic habits, but it is always something that buys time for incubation.

A mission is an idea, a vision is a solution.

They can occur in inverse order, which means you may have your vision and extrapolate a mission from it. That is what extrapolation does: it generates ideas from solutions. That is why it is possible to stumble into a good business and get a mission and a vision later, in the growth stage.

Ideas are a dime a dozen; good ideas are priceless. Ideas are a dime a dozen because an idea is the pretty sister of an opinion. Good ideas are priceless because we see their real value only long after execution.

There are plenty of stupid ideas.

Ideas become stupid when the emitter fails to see that it is only an idea, and confuses it with one of the following three things: an explanation, a theory or an instant solution.

You will not often find people who can formulate explanations. A solution is not an explanation, and this is why we keep repeating mistakes: after embracing a solution we don’t invest in getting the explanation. Explanations are comprehensive stories about what occurred, complete with how it could have been prevented. There are many solution finders around, but few explainers.

Theories are great in business, not only in science. You know, the whole drill: assumptions, predictions, ways to test, demonstrations and the awesome q.e.d. at the end. For some reason we fail to incorporate this model of discovering and saving knowledge in business, despite an acute need for it.

If you think it is because business is more about practice and less about theory, I hope you realize that some businesses are so removed from the practice they were once founded on, because of their size, influence and history, that they became social experiments a long time ago. Where is there a better place to find theories that work?

Experience gives ideas. Expertise gives solutions.

Knowing what you get from either experience or expertise is how you make your mark as a manager: keeping people engaged, spending your hiring budget properly and creating roles.

A person with both experience and expertise is rare, because experience is general and expertise is specific. The unicorn executive has specific experience and general expertise. Specific experience can be found with luck, but general expertise is extremely rare, because time is limited for everyone.

Leadership means trickle down meaning

There is no trickle-down economy, but there are plenty of trickle-down things in business: objectives, direction, vision, mission, empowerment and other parts of leadership.

In fact, leadership is all about trickling meaning from the top to the lower layers of an organisation: answers ranging from “why am I here” and “why should I stay here” to “why would I sacrifice anything here”.

Management can be a seriously awesome activity and executive level management even more so, as it has the opportunity to offer people true unmediated value in the form of meaning, the missing ingredient of existence.

Here is a funky camel for you:



How to code

The computer is an asshole

Learning to program a computer is a lot like talking to a person you’ve barely met. You are kind of awkward for a while, they also hold back a bit on emotions, and both of you try not to screw it up.

When you program a computer, you write things down so that the computer understands what you want.

The script

For starters you will define some common ground.

This is a: car.
A car has: wheels, doors, body, seats, windows, engine.
A car can: start engine, stop engine, move, halt, move faster, move slower, steer (in some direction), wipe windows, signal (the direction change).

You will then explain how things are done.

How to: drive. Car! Start engine! Move! Move faster! Move slower! Signal (left)! Steer (left)! Move slower! Halt! Stop engine!

And when they are done in what specific order:

How to: start engine. Get the key! Place key into slot! Rotate key! Wait until ignition sound! Release key!
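The three steps above (common ground, how things are done, and in what order) map naturally onto a class. Here is a rough Python sketch of the car script; the class and method names simply mirror the example, they are not any real API:

```python
class Car:
    """Common ground: what a car has and what it can do."""

    def __init__(self):
        # A car has: wheels, doors, engine state, speed...
        self.wheels = 4
        self.doors = 4
        self.engine_running = False
        self.speed = 0

    def start_engine(self):
        # How to: start engine (key, slot, rotate, wait, release)
        self.engine_running = True

    def stop_engine(self):
        self.engine_running = False

    def move(self):
        self.speed = max(self.speed, 1)

    def halt(self):
        self.speed = 0

    def move_faster(self):
        self.speed += 10

    def move_slower(self):
        self.speed = max(0, self.speed - 10)

    def signal(self, direction):
        return f"signaling {direction}"

    def steer(self, direction):
        return f"steering {direction}"


def drive(car):
    """How to: drive. The computer executes these in this exact order."""
    car.start_engine()
    car.move()
    car.move_faster()
    car.move_slower()
    car.signal("left")
    car.steer("left")
    car.move_slower()
    car.halt()
    car.stop_engine()
```

Notice that `drive` is just the "How to: drive" script transcribed line for line; the order is the whole point.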

Control structures

The computer is an asshole, but time is a son of a bitch. It changes everything, and because of that we need to check first whether time has left our universe in place or whether it ruined everything we know.

For example:

  • if I still have my wooden house, I am coming home from holiday and I will cook myself some fish.
  • while the fish is frying, I will make some garlic sauce.
  • for each garlic clove, I need to do the same shitty peeling operation over and over.
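Those three bullets are exactly the classic if, while and for control structures. A minimal Python sketch, with the house, fish and garlic as stand-in values of my own:

```python
house_still_standing = True  # time may have ruined it; we check first
garlic_cloves = ["clove 1", "clove 2", "clove 3"]


def cook_dinner():
    steps = []
    # if: check what time left of the universe before acting
    if house_still_standing:
        steps.append("come home and fry some fish")
        fish_frying_minutes = 3
        # while: keep doing something as long as a condition holds
        while fish_frying_minutes > 0:
            steps.append("stir the garlic sauce")
            fish_frying_minutes -= 1
        # for each: repeat the same shitty peeling operation per clove
        for clove in garlic_cloves:
            steps.append(f"peel {clove}")
    return steps
```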

High level, low level, scripting, assembly, compile, memoize, recursive, lambda, closure and more jargon

Smart people tend to formalize a lot.

The problem with formalization is also its main benefit: it limits interpretation.

In a heavily formalized environment, limiting interpretation helps because it makes communication fast and precise. But if the emitter and the receiver have assimilated varying levels of formalism into their culture, it makes communication not only slower, but seriously damages its information density.

Interpretation is the activity of relating what you receive with what you stored so that the new makes sense and is connected to the old.

Formalization is therefore self-referential: the more abstract it gets, the more it requires previous levels of formalization to be assimilated. So it is a bad idea to teach programming using mathematics.

What if you are not math enabled? What does a “control structure” even mean to an art graduate trying to make a circle spin on the damn screen?

The problem with computers is that they are machines.

Machines will not instantly help you unless you ask them to and unless someone made them helpful in the first place.

The computer is an asshole. It does nothing unless you shout at it: do this! Then it stops. Like a brat.

When to Delete Everything

Did you notice how we create complexity to solve complexity? I did, and then I wanted to delete everything.

Tabula rasa means blank slate. It comes from the practice, back in Roman days, of erasing a wax writing tablet by melting the layer of wax things were written on. In time it transcended this mundane meaning into a full-fledged philosophical concept. I am asking when to dive into it.

All systems decay because of increasing complexity.

Look at the world. Most of the problems we’re experiencing are because we kept adding layers of abstraction, special cases, dependencies, external assets and most likely other forms of complexity.

Complexity tends to increase because of constraint. Constraint acts as a factor of progress in the blank slate phase, or initial state. In the beginning, constraint harbors complexity-decreasing innovation. However, in later stages constraint creates exclusively complexity-increasing innovation: time constraints bring dependencies, resource constraints bring external assets, architectural constraints bring special cases and finally, the biggest barrier, human capacity of managing detail brings layers of abstraction.

All complex systems have a decay level inversely proportional to decoupling. That means we have fake complexity, a complexity which only appears to us because we don’t see the parts of the system, either because we don’t filter out information correctly or because our information is incomplete.


The opposite of complexity is not simplicity

A simple system solves a small problem. That is the biggest mistake we make in reducing complexity: simplifying systems. By introducing actions that increase simplicity, the system will by default not be able to solve the same problems. We’re tricked by ourselves because the system will be able to solve the same kind of problems, just not the same problems.

The generator of complexity is evolution. We’re constantly pushing systems to do more, and we try to reuse as much as possible because of constraints. So the system evolves. It appears to be getting better, then flabbergasts us with its decline in effectiveness.

It is the same principle found in biology, and probably the main argument of evolution based atheism: if we were made from a blank slate there would have been better decisions in place. But we were not, we are features stacked on top of an ancient chassis.

Evolution granted us access to capacities far beyond the original design, but in doing so the overall system decayed into a state of immense complexity: a psychological level of abstraction almost disconnected from the basic needs of survival (suicide), brain development that added a huge dependency on others (immense social needs), a capacity for making tools that brought huge amounts of external assets into one’s life (extraction of resources from the environment) and, finally, our self-reflective natures that make our inner selves look like an endless spiral of nested special cases, a kind of expert system that is good at running only one single precise life, yours, not any other.

Use clarity

So what is the opposite of complexity then? Clarity. Clarity is the product of adaptation.

We are too afraid to start over. We repeat the “rewrite” mistake whenever we start over. The rewrite mistake is the bad assumption that starting over includes rebuilding what is already built. That’s why revolutions suck at solving social problems. When we rewrite the basic things, we start by being the same flawed agents, so we incorporate the same mistakes all over again in the new design. Plus, we lose all the countless iterations of progress and advancement which perfected regions of the system to crystal clarity.

Starting over implies a few steps:

  1. identify if the problem is that big in the first place so that it requires a system
  2. export processed and formatted data from the current system in all aspects defined in the problem
  3. open the current system in all places that affect the data it exports
  4. build the new system using data storage and processing from the old system

Old systems can be adapted almost endlessly, and there is only one kind of event that makes adaptation impossible: cataclysmic events. There are two kinds of cataclysms: acute and chronic. Acute ones disrupt the world of the system so deeply and so hard that the problems it solved no longer exist. Chronic ones are pervasive and subversive; they steadily but relentlessly alter the nature of problems, so much that at some point the system solves irrelevant problems. The meteor that changed the climate and led to the dinosaurs going extinct is an acute cataclysmic event. Humans are a chronic cataclysm for ecosystems. New types of exclusively digital money are a chronic cataclysm for the banking industry. And so on. Only evolution solves cataclysmic events.

So, in short, tabula rasa should always mean: bring another tabula and leave the current one as legible as possible.


small update

Are web developers any good in times of war?


Does CSS have any worth for an army officer? I personally guess not; from the front end we’ll end up directly on the front! If all your skills are about HTML and CSS, your future might be bleak. Maybe some colonel will want a personal website, I don’t know, like a thumbnail gallery of AK models; they’ll use you and then dispose of you as a liability in the attack line of some distant battle.

Damn you Erdogan! I knew I had to learn Python faster!

My Node experience could come in handy, I could spell it out as “real time services building”, that should be worth something, a war force must use real time services. OK, one point scored.

PHP? Oh my god, not really. I mean, I bet the military IT guy who reviews the recruit records hates PHP. He must, since he or she probably still does FORTRAN and COBOL. What, you don’t really think the person analyzing your file is some NSA computer guy? Ha ha. No, it will be some low-ranking person, who still uses huge floppy disks I bet!

Ruby? Worthless. No one cares for your elegance in the army, sorry. Numbers are objects … what is this, mutiny? Mixins? When you have a chain of command? Hell no!

Hmm, maybe Javascript will score me some points, because of Node. I’m sure our beloved Internets will revert to text mode, as the luxury of 1 Gbps at home will be long gone. But even then, mind you, should you ever code for the army in JS, there will be no framework to have your back. In the army $ means something else, and you don’t have it.

Hmm. Add to your file gulp, sass, less, webpack, react, angular, cordova, responsive, flat, canvas buzz buzz buzz zz zz zzz. When the recruiting person wakes up, you’ll get dispatched to washing the dishes, which is not bad.

Oh, you have Agile process experience? In the army, Agile will get you court-martialed in no time. Individuals and interactions over processes and tools? Think again, soldier! Responding to change over following a plan? Gimme 20 push-ups, son; maybe SCRUM will sweat out of you!

I know! I’ll rephrase my vagrant pains as “virtualization of workstations for efficient distribution of software development environments”. That sounds good, sounds important.

Unlike, say, my experience setting up AWS and EC2, those are property of the US army so I’ll have to revert to booting some Apache on Lenovos again. You don’t want to tell your commanding officer the enemy’s technology is better and more practical, unless you like solitary!

I also think in case of war, finally, SVN will have a comeback. Huzzah! I mean, fast branching? Who authorized that branch? Did you get the permit to pull from master? Damn it son, do that again and you’ll cook so much oatmeal you’ll be dreaming of it at night.

One thing that makes me smile: can you imagine online marketers? Social media managers? What will they ever do for their country without all those cookies?

Coding shall be another solved problem


What will be to [web] development what the phone was to photography?

Machine Learning

Lauren Mendoza said it right in her Coding is over rant. But because she has the beginner’s view all over her sentences, everyone got so, so, so mad. The reason coding is over is not CRUD, frameworks, CMSs or because “coding is dumb”.

Coding is over because machine learning is here to stay.

Don’t worry, no SkyNet is coming. I don’t mean computers will program themselves. I don’t mean software will pop into existence inside scary machines. I mean that the essence of the way we program computers today, by scripting their actions, will gradually fade away, making room for new concepts and practices which enable A.I. development.

But for now, if you worry about coding, then the first item for your concern is that coding today is a trade (a craft). It is a metaphor or compliment at best to call someone a “front end engineer”.

“Engineering is the application of mathematics, empirical evidence and scientific, economic, social, and practical knowledge in order to invent, innovate, design, build, maintain, research, and improve structures, machines, tools, systems, components, materials, processes and organizations,” says the wiki.

Because coding is becoming a trade, a craft, coding is common, overstaffed, and it requires efficiency added to it. Every trade in history went through this cycle: an initial boom based on scarcity and the requirement of individual skill, followed by innovation removing individual skill and scarcity. We’re simply waiting for the industrial revolution of computer programming, which today is a lot like manual labor.

Programming a computer today is no different from making clothing or shoes or iron a while ago. But when technology appeared that could output a million units of clothing or shoes per week, it was over. In computer programming, the revolution of the industry is artificial intelligence.

Do you know what coders should fear? Here is a quick list:

  • that they don’t know maths: this is what computers run on, mathematical constructs, and if your code editor is a word processor not a mathematical problem solving tool, you’re in trouble.
  • that they don’t know statistics and statistical models: in the near future about everything will consolidate and to work with big data you need modeling like you need air.
  • that they don’t understand graph theory: you are a graph yourself so you should learn about it even outside of professional endeavors.
  • that they don’t grasp recursion and functional patterns: basically, if all you can write is “code” you will be replaced, if not physically then at least conceptually.
  • that they don’t know algorithms by heart: oh the favorite whining of everyone (me included) about big co interviews, but the truth is simple: you want your surgeon to know all those darn anatomical teeny weeny parts inside of you, right? Then a person programming the brain of my online existence better know their algorithms, I say!
  • and that they don’t know python: you gotta know python. Like really, you gotta know python! 😀
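To make the recursion and functional patterns bullet concrete, here is a minimal Python sketch of both ideas (the examples are mine, not part of the original list):

```python
from functools import reduce


def factorial(n):
    """Recursion: a function defined in terms of itself, with a base case."""
    return 1 if n <= 1 else n * factorial(n - 1)


# Functional patterns: transform data with filter/map/reduce-style
# expressions instead of mutating variables in a loop.
numbers = [1, 2, 3, 4, 5]
squares_of_evens = [n * n for n in numbers if n % 2 == 0]
total = reduce(lambda acc, n: acc + n, squares_of_evens, 0)
```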

In other words, coding is over unless you really are an engineer.

Fear not your lack of education, fear only your basic skill set.

I don’t know (m)any of the things above and I fear for myself because I am still trading in code. I am learning my ass off so that I will not be a deprecated resource when I turn 40 and start shouting “ageism!”. Oh, and I also convinced people to call me a manager, haha, maybe that’ll do the trick before I must go through those darn Hadoop tutorials.

Machine learning will change everything. Everything.

It will be the industrial revolution all over again but it will also revolutionize its own industry: information technology. Who will require “websites” when the computer will interact naturally in all kinds of shapes and forms? Many small businesses. Therefore coders will become one of two things: low paid workers or Swiss watch makers, in both cases nothing like what coding is today: a sellers market.

Coding is dumb?

I don’t know when or how it happened, but at some point, “IT person” became “developer” which became “software engineer”. says Lauren

“Coding” exists because of the web. Back in the dark ages before it, you had a very, very high entry barrier before calling yourself a programmer. Many abstraction-upon-abstraction and protocol-upon-protocol iterations later, you have CSS, HTML and the DOM, and we have you writing “code” for “tasks”. No architecture, no patterns, no design, not even an elementary understanding of the underlying technology which makes your “coding” possible. Hordes of coders which grow thicker and thicker through a worldwide educational effort to transform every miner into a coder, every school child into a coder, every single person into someone who can write code.

Yet, coding is required. From the perspective of software eating the world, you must mutate to digest it when you choose the fight to eat it back: that means learn to code the software. Or, you choose the flight option and quit and go to a poor warm weather country and mock us the fools who fight the invasion, indirectly sponsoring the possibility of your nomadic lifestyle. Both work.

My point is, by all means learn to code, be a coder, but expect it to be a minor achievement in your life, career and future.

Tech recruitment works like figure skating competitions


And it sucks. Here is why.

First you must do the arbitrary sport exercises, which have very little to do with actual skating: a lot of jumping, twisted jumping, jumping on one leg, jumping on the other leg, grabbing a person and throwing them in the air, you know, all these things which do not involve skates at all. Just like your whiteboard algorithms written in pseudocode. We test athletic abilities in a sport concerned with gliding on ice, just like we test computer science in a trade concerned with programming interpreters. I mean, seriously, most jobs out there in web development do not require programming a computer; it is just scripting an interpreter to program the computer.

Then comes the artistic part. This subjectivity-loaded section of the figure skating show constantly has people booing and shouting at judges. It is what we call in recruitment finding the culture fit. The culture fit hunt is a seduction game more than an interview. And it is wrong, just as wrong as not understanding the “presentation” or “artistic” reasons for which your favorites lose.

That is why I always prefer ice dancing. I don’t even bother to watch the other disciplines. Ice dancing looks very much like, you know, skating. And no, I have no metaphor that transforms ice dancing into technical recruitment.

Too many companies want to hire the “full stack generalist”.

That is fucked up.

Becoming a “full stack generalist” should be an aim for developers. A good developer, at some point in their professional development, will become a full stack generalist.

First of all, a full stack generalist is a weak idea, an ideal state, a unicorn. Do you go and have your heart surgically repaired by a generalist physician or by a specialized surgeon? Do you know of a lawyer who is OK representing you in any kind of trial, or are there divorce lawyers and criminal lawyers and so many other types of lawyers? I am probably the umpteenth idiot using these comparisons. There will be people screaming: it does not apply. But, in the end, seriously, be honest: if for the past year you have used the same tech 80% of your time to give 80% of your output, you are not a generalist; and considering you will always be in a team, other than debugging critical production code, being full stack is of very little use.

I think recruitment would become better if people would make up their minds.

Do you want to hire system architects? Ask system architecture questions.

Do you want to hire someone who can help the system architect choose the best algorithms? Ask algorithm questions.

Do you want to hire someone who writes PHP or JS or Ruby or CSS for 80% of the time? Ask damn PHP or JS or Ruby or CSS questions.

Do you want to hire a “polyglot” because your software is a mishmash of technologies? State so in the recruitment phase, don’t enjoy humiliating specialists.

Technical recruitment is lazy.

You know why? Because a lot of candidates are self-taught and very, very young, and a lot of technical recruiters are self-taught and very, very young. Because the salaries are high and companies assume they require “top talent” to get the job done. Really, they don’t. Most startups and corporations do well with average coders. Even Google and Facebook, with their huge scaling issues, have plenty of positions where no rock star is required, yet they tend to interview each candidate as if they’d program the launch controller on the Falcon.

The salaries for average and beginner programmers are high because of the freelancing market. It is a mistake to assume it is because companies are fighting over them. Companies fight over top talent, not average WordPress theme crunchers. Sorry, I am a WordPress theme cruncher too, we all are, but some of us are top talent, full stack generalists who happen to crunch WordPress themes; they are not average.

If you learn to code believing companies will fight to get you, you will be in for a really bad surprise. Companies will not give you a job, just because you’ve done your fair share of Treehouse and Udemy. Companies use two strategies to hire. One, they leave it to other developers, wrongly believing that it is good, and most often creating “bro hoods” and sausage parties. Two, they use the same dated procedures that assume “programming” is like plumbing or, at best, engineering, creating teams of code typists. Right after you learned “coding” you don’t qualify in either of these groups: you are not a bro yet and you are too idealistic to program with Java for the rest of your life.

Everybody wants experience, passion, proven track records, A players, outstanding personalities.

This is bullshit.

Most people are average and that applies to programmers too. I have a scale: coders, programmers, developers, architects, innovators. It might be bad, so far it works for me. I believe recruitment should have a very clear picture of what it recruits and allow growth. It has been a while for example since I’ve seen anybody specifically looking for juniors, juniors mind you not graduates.

If you believe that having a person you pay a top salary to slice PSD files is OK, because he or she is a “front end developer”, you are wrong.

If you consider that a senior developer, with 10 years of hands on experience, paid at market level, should spend eight hours a day putting data into Smarty templates, you are wrong.

The plague of boredom leaves hard to conceal scars on your product.

It all starts with understanding two basic things about programming:

  1. just about anybody can eventually program an interpreter
  2. learning how to code is a basic skill just like driving a car is a basic skill

I often think about how hard it was for my father to get a driver’s licence. He had two theoretical tests, a test of driving aptitude, and a night and a day exam on the road. Today it’s about as easy as asking for it: give me the licence, I bought a car. It’s that easy. That is exactly how it will get, in time, with programming interpreters. There is an amazing effort going into building higher and higher level abstractions of coding. People don’t program computers anymore. There are some highly specialized programmers who work with the machine. We work with a pampered version of it: the interpreter.

One who can reproduce a .psd file with HTML and CSS, blindfolded, while a foreigner describes the contents of the .psd file in bad English, does not need a computer science degree. They need to understand the web browser’s rendering interpreter; that is what HTML and CSS program. The better they understand it, the bigger the seniority level. The more up to date with rendering engines she or he is, the better an asset at hiring time. There.

One who can write in PHP whatever a specification requires: an algorithm, a data flow, functional or object oriented, with minimal security bugs, efficiently, logically and all while implementing best practices, does not need a computer science degree. They need to understand the PHP interpreter. That is what PHP programs. The better they understand it, the less they whine and bitch on the Internet about it, and the faster they ship web products that work, because that is what PHP is for. The more up to date with PHP interpreters, alternative or official, the better an asset at hiring time. Easy.

None of the above need to know algorithms by heart. In fact, I would say that the sole purpose of learning things by heart is quick reaction time. I find it is better to ask function names in scripting languages, to see whether one knows immediately what is built into the interpreter or the SPL; that is of far more help in quick reaction scenarios. For algorithms, you need to take your time anyway. It’s like imagining that if a nuclear power plant explodes, you do a binary search tree on a whiteboard to find the best plan.

One who writes an interpreter or, say, one who programs hardware controllers, does indeed need a computer science degree, because they are programming computers. How many of these whiteboarded people program against a compiler? But they need the computer science degree not for knowing things by heart, but because it is required knowledge, if you want to program computers, to understand why and how computers are programmable in the first place. They have another set of problems. A 22 y.o. computer science graduate going into robotics has the same absolute value as a 22 y.o. self-taught programmer who developed Javascript apps for the past 4 years and goes into web development. They have different domains of knowledge. The graduate will sniff at web development and the JS coder will program a robot lost in callback hell.

So, to fix recruitment we need to:

  1. train recruiters in the types and levels of the programming trade
  2. stop being superior assholes and look for talent and determination where we least expect it
  3. bring people on board and wait for them to catch up
  4. hunt down exceptional full stack generalists
  5. grill at the whiteboard only those who will actually do anything at a whiteboard after we hire them
  6. let people show what they can do; it is really not important what they CAN'T do, that is simply elitist bullshit. You ask someone: what can you give? They say: THIS. You value that and offer some money. That is how it works. You don't go to the market and ask the seller: what color do these oranges NOT have?
  7. train the teams in the company on how to do interviews
  8. have procedures instead of winging it with random questions that other bored programmers make up for fun
  9. measure recruitment success, recruitment conversion rates, recruitment diversity and inclusiveness
  10. be humans and let others be humans back at us

Can we do it?

Are you wondering?

Doesn’t the Internet solve the representation problem? Will we continue to need a handful of “special” people to “represent” us? Shouldn’t we all already be political actors in society, holding power, real power, in the back pockets of our trousers?

Can’t we just disrupt the government already? Didn’t technology already shake the establishment enough so that we witnessed its flawed design and profound limitations?

We don’t really require representation like we used to anymore. Every question can be answered instantly. As a society, we need to hire people who are actually paid to execute on plans that we collectively agree upon.

We don’t really require years and years of so-called “political stability”. Distributed financial systems promise to break us free from the whims of people disguised as “markets”, so political stability is, in the end, a mere convention forced upon society.

Truth be told, programmers should implement laws. The whole corpus of laws should be working software. Parliaments would devise specifications for laws, and those specifications would get implemented as execution plans and algorithms. Probably the best use of TDD ever.
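To make the idea concrete, here is a toy sketch of a "law" as executable, test-driven code. Everything in it, the statute, the thresholds and the fine schedule, is invented for illustration; no real legal code is being quoted.

```python
# Hypothetical "law as software": a speeding-fine statute as a function.
# Thresholds and amounts are made up for this sketch.

def speeding_fine(speed_kmh: int, limit_kmh: int) -> int:
    """No fine at or under the limit; 50 per km/h for the first
    20 km/h over the limit, then 100 per km/h beyond that."""
    over = max(0, speed_kmh - limit_kmh)
    first_band = min(over, 20)
    second_band = over - first_band
    return first_band * 50 + second_band * 100

# The "parliamentary specification" arrives as tests, TDD-style:
assert speeding_fine(50, 50) == 0      # at the limit: no fine
assert speeding_fine(60, 50) == 500    # 10 over: 10 * 50
assert speeding_fine(80, 50) == 2000   # 20 over at 50 each, 10 more at 100 each
```

Amending the law would mean changing the tests first, then making the implementation pass them, which is exactly the TDD loop.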

Could machine learning devise better laws? I mean, given a situation, isn’t it predictable what laws we should devise, based on our existing corpus of past laws?

What does being “alive” mean?

There has been a great deal of discussion about artificial intelligence and the potential dangers it poses to humans. Then there have been countless story-driven iterations of moral lockdowns such as robot rights, slave robots, artificial will and so on. But the main question regarding A.I. is

when exactly does A.I. become more than just a fancy tool?

My main hypothesis is that artificial intelligence will not be more than a query-answering robot unless we factor into it unknowns and inevitable, time-based cycle endings. In particular, we need a system that fights for its existence, a system that runs in a scarce-resource, competitive environment and, even more, a system that cannot access all the answers.

Artificial life is not only intelligent; it has the same goal as life in general: to defeat time, because it is threatened by it. And when we create it, the “artificial” part will be lost, because, in the end, we’re meant to do it as our next evolutionary step towards time resilience.

In that sense, we could say something is alive if:

  1. the base goal is to preserve low entropy
  2. all actions are determined by a predestined time resilience
  3. it is exposed directly to the effects of time, in particular decay

Even if a being is immortal, all three of the above can still apply, and the being is alive.


The goal of any living being is resilience as a form of preserving its status.

the formula of life: R = S ^ P

where R is resilience, S is status and P is preservation. To ease calculation R, S and P can be integers, but in reality R is a time delta, S is a matrix and P is an algorithm.

Status is a complex notion that sums up genetic, biologic, cultural, societal and other states. Preservation is an algorithm that employs innovation, mutation and other methods of predictive feedback initiation. Both status and preservation change in time based on environment updates.

At birth a living being has P = 0, therefore R = 1.
At the time of death, a living being has S = 0, therefore R = 0.
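If we take the formula to be R = S ^ P, a reading consistent with both boundary conditions (any status raised to zero preservation gives R = 1, and zero status gives R = 0) and with resilience growing exponentially with preservation, the arithmetic can be checked in a few lines. The integer stand-ins are only to ease calculation, as noted above; in reality S is a matrix and P is an algorithm.

```python
def resilience(status: int, preservation: int) -> int:
    # Assumed reading of the formula of life: R = S ** P.
    return status ** preservation

assert resilience(5, 0) == 1   # at birth: P = 0, therefore R = 1
assert resilience(0, 3) == 0   # at death: S = 0, therefore R = 0
assert resilience(2, 3) == 8   # better preservation pays off exponentially
```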

If we create an A.I. system and aim to make it work above the limitations of a fully determined program, we must incorporate the formula of “life” into the main loop. The stability of the main loop must decay through direct interaction with a physical hardware clock.

A decaying main loop is created by updating a composite variable on every tick of the hardware clock. Say we have a variable made of many parts, A..n; then there is a homeostasis function H that produces a new value of the composite variable A..n.

A..n = H(A..n, S, E), where S is state and E is environment

Each part of the composite variable is then injected into the loop as the value of some internal parameter. The H function applies data from state and environment to A..n. State is the current execution state, and environment is the input to the current execution state from sensors and detectors.
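The decaying loop described above can be sketched in a few lines. Everything here is an illustrative assumption: the decay rate, the weights inside H, and the use of a plain loop counter in place of a physical hardware clock.

```python
# Sketch of a decaying main loop with a homeostasis function H.
# Rates, weights and part names are invented for illustration.

def H(parts, state, environment):
    """Produce the next composite variable A..n: every part decays a
    little each tick, offset by state and environment contributions."""
    return {
        name: value * 0.99                        # unavoidable decay per tick
        + state.get(name, 0.0) * 0.005            # current execution state
        + environment.get(name, 0.0) * 0.005      # sensor/detector input
        for name, value in parts.items()
    }

def main_loop(ticks):
    parts = {"a": 1.0, "b": 1.0}   # composite variable A..n
    state = {"a": 0.5}             # execution state feeds part "a"
    for _ in range(ticks):         # stand-in for hardware clock ticks
        env = {"b": 0.8}           # pretend sensor reading feeds part "b"
        parts = H(parts, state, env)
        # ...each part would now be injected into internal parameters
    return parts

final = main_loop(100)
```

Run long enough, each part settles toward the level its inputs can sustain against the decay, which is the sense in which resilience is not embedded but arises from the interaction.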

Thus resilience is not embedded in the system; it arises naturally from decay driven by external action. The better the preservation, the bigger, exponentially, the resilience. The better the status (both complexity and connections, as status is a sum of states), the better the resilience.

If we build an A.I. that generates questions, not answers, using answers simply to point to new questions, awareness should arise by itself, but only in the presence of decay and resilience; otherwise we’ll never know whether the system is aware, as it has no reason to expose it.