The circular economy : Nature News & Comment

A new relationship with our goods and materials would save resources and energy and create local jobs, explains Walter R. Stahel.

When my battered 1969 Toyota car approached the age of 30, I decided that her body deserved to be remanufactured. After 2 months and 100 hours of work, she returned home in her original beauty. “I am so glad you finally bought a new car,” my neighbour remarked. Quality is still associated with newness, not with caring; long-term use is seen as undesirable, not resourceful.

Cycles, such as of water and nutrients, abound in nature — discards become resources for others. Yet humans continue to ‘make, use, dispose’. One-third of plastic waste globally is not collected or managed1.

There is an alternative. A ‘circular economy’ would turn goods that are at the end of their service life into resources for others, closing loops in industrial ecosystems and minimizing waste (see ‘Closing loops’). It would change economic logic because it replaces production with sufficiency: reuse what you can, recycle what cannot be reused, repair what is broken, remanufacture what cannot be repaired. A study of seven European nations found that a shift to a circular economy would reduce each nation’s greenhouse-gas emissions by up to 70% and grow its workforce by about 4% — the ultimate low-carbon economy (see go.nature.com/biecsc).

The concept grew out of the idea of substituting manpower for energy, first described 40 years ago in a report2 to the European Commission by me and Geneviève Reday-Mulvey while we were at the Battelle Research Centre in Geneva, Switzerland. The early 1970s saw rising energy prices and high unemployment. As an architect, I knew that it took more labour and fewer resources to refurbish buildings than to erect new ones. The principle is true for any stock or capital, from mobile phones to arable land and cultural heritage.

Circular-economy business models fall into two groups: those that foster reuse and extend service life through repair, remanufacture, upgrades and retrofits; and those that turn old goods into as-new resources by recycling the materials. People — of all ages and skills — are central to the model. Ownership gives way to stewardship; consumers become users and creators3. The remanufacturing and repair of old goods, buildings and infrastructure creates skilled jobs in local workshops. The experiences of workers from the past are instrumental.
Source: The circular economy : Nature News & Comment

 

The 4 Rituals That Will Make You An Expert At Anything

How do you become an expert? Here’s an interview with the professor who created the 10,000 hours theory of expertise that will teach you how to be the best.

We hear a lot about “10,000 hours” being what it takes to become an expert. But the majority of people totally misunderstand the idea.

So I decided to go to the source and talk to the guy who actually created the theory.

Anders Ericsson is a professor of psychology at Florida State University. His wonderful new book is Peak: Secrets from the New Science of Expertise.

So what does everybody get wrong? 2 things.

First, the “10,000 hour rule” is not a rule and it’s not an exact number. The amount of time varies from field to field. It’s an average. But it’s always a lot and more is better. Here’s Anders:

In most domains it’s remarkable how much time even the most “talented” individuals need in order to reach the highest levels of performance. The 10,000 hour number just gives you a sense that we’re talking about years of 10 to 20 hours a week, which even those whom some people would argue are the most innately talented individuals still need in order to get to the highest level.

What’s the second mistake? Becoming an expert is not merely doing something over and over for 10,000 hours. There’s a right way — and an awful lot of wrong ways — to spend that time.

Let’s learn the right way…

Sum Up

Here’s what Anders says can make you an expert:

  • Get Help: Find a mentor who can help you develop that image in your head of the best way to do something.
  • It’s Not “Try Harder”, It’s “Try Different”: Design specific activities to address your weak points.
  • It’s About Doing, Not Knowing: Remember the three F’s: Focus, Feedback, Fix it.
  • Study The Past To Have A Better Future: Find examples that have been judged and quiz yourself.

Don’t worry; you do not have to be a genius to become an expert at most things. In fact, Anders says it might be an advantage not to be a genius.

When elite chess players were studied, the ones with lower IQs often worked harder and then did better because they felt they were at a disadvantage.
Source: The 4 Rituals That Will Make You An Expert At Anything

 

d’Oh My Zsh — Medium

How I unexpectedly built a monster of an open source project

This wouldn’t be my first foray into open source software; nor my last.

It was the summer of 2009. I found myself helping a coworker debug something in their terminal. As I attempted to type in a few command lines, I noticed that the prompt wasn’t responding to the shortcuts that my brain had grown accustomed to. Frustrated, I exclaimed, “when are you finally going to switch over to Zsh?!”

(Yeah, I was the type of annoying coworker who would constantly point out that X was better than Y when given the chance. In hindsight, I don’t know how they put up with me… but between you and me, I had a point.)

At that point in time, I had been a daily Zsh user for a little over three years.

Some of my #caboose friends shared a few of their .zshrc configurations within our IRC channel. After a few years, my .zshrc file grew into a tangled rat’s nest. Honestly, I didn’t know what ~30% of the configuration did. I trusted my friends enough to run with it, though. What I did know was that I had some git branch and status details, color highlighting for a few tools (i.e., grep), autocompleting file paths over SSH connections, and a handful of shortcuts for Rake and Capistrano. Working on a machine with a default Bash profile felt remarkably archaic; I’d become dependent on these shortcuts.

Source: d’Oh My Zsh — Medium

 

NPM & left-pad: Have We Forgotten How To Program? | Haney Codes .NET

Okay developers, time to have a serious talk. As you are probably already aware, this week React, Babel, and a bunch of other high-profile packages on NPM broke. The reason they broke is rather astounding.

The culprit was a simple NPM package called left-pad, a dependency of React, Babel, and other packages. At the time of writing this, it has 11 stars on GitHub. The entire package is 11 simple lines that implement a basic left-pad string function. In case those links ever die, here is the entire code of left-pad:

module.exports = leftpad;
function leftpad (str, len, ch) {
  str = String(str);
  var i = -1;
  if (!ch && ch !== 0) ch = ' ';
  len = len - str.length;
  while (++i < len) {
    str = ch + str;
  }
  return str;
}

What concerns me here is that so many packages took on a dependency for a simple left padding string function, rather than taking 2 minutes to write such a basic function themselves.
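
For context, the function really is as small as the author says. Here it is again as a self-contained, runnable sketch with a few sample calls (the function body is the one from the snippet above, with comments added; the sample inputs are my own):

```javascript
// left-pad, as published in the package, plus a few illustrative calls.
function leftpad(str, len, ch) {
  str = String(str);              // coerce any input to a string
  var i = -1;
  if (!ch && ch !== 0) ch = ' ';  // default pad character is a space
  len = len - str.length;         // how many pad characters are needed
  while (++i < len) {
    str = ch + str;               // prepend one pad character at a time
  }
  return str;
}

console.log(leftpad('foo', 5));     // "  foo"  (pads with spaces by default)
console.log(leftpad(17, 5, 0));     // "00017"  (numbers are coerced)
console.log(leftpad('foobar', 3));  // "foobar" (never truncates)
```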

As a result of learning about the left-pad disaster, I started investigating the NPM ecosystem. Here are some things that I observed:

  • There’s a package called isArray that has 880,000 downloads a day, and 18 million downloads in February of 2016. It has 72 dependent NPM packages. Here’s its entire single line of code:
    return toString.call(arr) == '[object Array]';
  • There’s a package called is-positive-integer (GitHub) that is 4 lines long and as of yesterday required 3 dependencies to use. The author has since refactored it to require 0 dependencies, but I have to wonder why it wasn’t that way in the first place.
  • A fresh install of the Babel package includes 41,000 files
  • A blank jspm/npm-based app template now starts with 28,000+ files
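
The isArray case underlines the point: the entire check is a one-liner, and modern runtimes ship it built in. A minimal sketch (the isArray helper mirrors the package’s approach; Array.isArray has been part of the language since ES5):

```javascript
var toString = Object.prototype.toString;

// The package's approach: compare the value's internal class tag.
function isArray(arr) {
  return toString.call(arr) == '[object Array]';
}

console.log(isArray([1, 2, 3]));      // true
console.log(isArray('not an array')); // false
console.log(Array.isArray([1, 2]));   // true, and no dependency needed
```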

All of this leads me to wonder…

Have We Forgotten How To Program?

Source: NPM & left-pad: Have We Forgotten How To Program? | Haney Codes .NET

 

A quick look at the overall structure and some interesting aspects of PostgreSQL

To get and compile the code, run the normal commands:

$ git clone git://git.postgresql.org/git/postgresql.git

$ ./configure
$ make

(Interestingly, the Makefile is actually committed into the repository, so you might not even need to run the configure script.)

Before looking at the code, try to imagine what the overall structure might be, based on what it needs to accomplish. There needs to be a network component, which hands data to a parser, then maybe an optimizer, and some code to actually run the queries. There is of course a doc directory and README files, but let’s jump right into the src directory.

Structure is the key to understanding when it comes to code, and a diagram might help, but here we’re trying to understand the structure looking at the code itself.

The broad division in PostgreSQL is between the frontend (libraries/CLI making requests over the network), and the backend. The backend is probably more interesting. Run $ cd src/backend; ls and we find this directory listing (among others):

Makefile	common.mk	main		po		replication
access		executor	nls.mk		port		rewrite
bootstrap	foreign		nodes		postgres	snowball
catalog		lib		optimizer	postmaster	storage
commands	libpq		parser		regex		tcop
tsearch 	utils

Woah, what is snowball? cd snowball; cat README. Looks like a grammatical library for natural language processing, designed to find the stem of words: for example, from the word “eating” it will find “eat.” Apparently PostgreSQL lets you search text based on the stem of the word. That’s cool, learning already.

Optimizer and parser are easy enough to figure out, but what about postmaster? Does that sound like a network layer? less postmaster/postmaster.c and we see this beautiful comment:

/*-------------------------------------------------------------------------
 *
 * postmaster.c
 *        This program acts as a clearing house for requests to the
 *        POSTGRES system.  Frontend programs send a startup message
 *        to the Postmaster and the postmaster uses the info in the
 *        message to setup a backend process.
 *
 *        The postmaster also manages system-wide operations such as
 *        startup and shutdown. The postmaster itself doesn't do those
 *        operations, mind you --- it just forks off a subprocess to do them
 *        at the right times.  It also takes care of resetting the system
 *        if a backend crashes.
 *

So PostgreSQL has a high-level message concept, excellent. grep socket * -r leads to pqcomm.c, which contains the low-level network routines. It holds another nice comment:

/*------------------------
 * INTERFACE ROUTINES
 *
 * setup/teardown:
 *  StreamServerPort        - Open postmaster's server port
 *  StreamConnection        - Create new connection with client
 *  StreamClose             - Close a client/backend connection
 *  TouchSocketFiles        - Protect socket files against /tmp cleaners
 *  pq_init                 - initialize libpq at backend startup
 *  pq_comm_reset           - reset libpq during error recovery
 *  pq_close                - shutdown libpq at backend exit
 *
 * low-level I/O:
 *  pq_getbytes             - get a known number of bytes from connection
 *  pq_getstring            - get a null terminated string from connection
 *  pq_getmessage           - get a message with length word from connection
 *  pq_getbyte              - get next byte from connection
 *  pq_peekbyte             - peek at next byte from connection
 *  pq_putbytes             - send bytes to connection (flushed by pq_flush)
 *  pq_flush                - flush pending output
 *  pq_flush_if_writable    - flush pending output if writable without blocking
 *  pq_getbyte_if_available - get a byte if available without blocking
 *
 * message-level I/O (and old-style-COPY-OUT cruft):
 *  pq_putmessage           - send a normal message (suppressed in COPY OUT mode)
 *  pq_putmessage_noblock   - buffer a normal message (suppressed in COPY OUT)
 *  pq_startcopyout         - inform libpq that a COPY OUT transfer is beginning
 *  pq_endcopyout           - end a COPY OUT transfer
 *
 *------------------------
 */

Moving back up, the lib directory looks interesting. I wonder what’s in it?

$ ls lib
Makefile	    bipartite_match.c    objfiles.txt	   stringinfo.c
README		    hyperloglog.c        pairingheap.c
binaryheap.c	    ilist.c              rbtree.c

Fascinating, PostgreSQL uses a red-black tree. I haven’t used those much since college. Here is some code from rbtree.c. It looks a lot like a college textbook:

/*
 * rb_leftmost: fetch the leftmost (smallest-valued) tree node.
 * Returns NULL if tree is empty.
 *
 * Note: in the original implementation this included an unlink step, but
 * that's a bit awkward.  Just call rb_delete on the result if that's what
 * you want.
 */
RBNode *
rb_leftmost(RBTree *rb)
{
        RBNode     *node = rb->root;
        RBNode     *leftmost = rb->root;

        while (node != RBNIL)
        {
                leftmost = node;
                node = node->left;
        }

        if (leftmost != RBNIL)
                return leftmost;

        return NULL;
}

Source: A quick look at the overall structure and some interesting aspects of PostgreSQL