
Evaluation of Six Password Managers

To see the evaluation table, click on the Grid below, then the “Result” tab in the embed. Initially, all of the checklist items we considered in our evaluation are listed in the table. If you scroll to the bottom, you will see a list of feature tags. Uncheck the features you don’t care about, and they will be removed from the grid. Then you can select any two products from the two drop-down menus at the bottom, and the grid will add a column with a head-to-head comparison of those two products and an overall score at the bottom.

Click the Grid to see the interactive chart.

 

Original article here.

 

Laws Of Software Development

One surefire way to sound really really smart is to invoke a law or principle named after some long dead guy (an alive guy is acceptable too, but lacks slightly in smart points).

This realization struck me the other day while I was reading a blog post that made a reference to Postel’s law. Immediately I knew the author of this post must be a highly intelligent, card-carrying member of Mensa. He was probably sporting some geeky XKCD t-shirt with a lame Unix joke while writing the post.

Well friends, I admit I had to look that law up, and in the process realized I could sound just as scary smart as that guy if I just made reference to every eponymous (I’ll wait while you look that one up) “law” I could find.

And as a public service, I am going to help all of you appear smart by posting my findings here! Don’t let anyone ever say I don’t try to make my readers look good. If you look good, I look good.

Make sure to invoke one of these in your next blog post and sound scary smart just like me.

Postel’s Law

The law that inspired this post…

Be conservative in what you send, liberal in what you accept.

Jon Postel originally articulated this as a principle for making TCP implementations robust. The principle is also embodied by HTML, which many cite as a cause of its success or its failure, depending on whom you ask.

In today’s highly charged political environment, Postel’s law is a uniter.
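
For a concrete (and entirely hypothetical) illustration, here is a minimal Python sketch of the law applied to date handling: the parser is liberal about the formats it accepts, while the emitter is conservative and always sends one canonical format. The format list and function names are mine, purely for illustration.

    from datetime import datetime

    # Liberal in what we accept: several common date formats.
    ACCEPTED_FORMATS = ("%Y-%m-%d", "%d/%m/%Y", "%b %d, %Y")

    def parse_date(text):
        for fmt in ACCEPTED_FORMATS:
            try:
                return datetime.strptime(text.strip(), fmt)
            except ValueError:
                continue
        raise ValueError(f"unrecognized date: {text!r}")

    def emit_date(dt):
        # Conservative in what we send: one canonical format.
        return dt.strftime("%Y-%m-%d")

    print(emit_date(parse_date(" Mar 5, 2018 ")))  # -> 2018-03-05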

Parkinson’s Law

Otherwise known as the law of bureaucracy, this law states that…

Work expands so as to fill the time available for its completion.

As contrasted with Haack’s Law, which states that…

Work expands so as to overflow the time available and spill on the floor leaving a very sticky mess.

Pareto Principle

Also known as the 80-20 rule, the Pareto Principle states…

For many phenomena, 80% of consequences stem from 20% of the causes.

This is the principle behind the painful truth that 80% of the bugs in the code arise from 20% of the code. Likewise, 80% of the work done in a company is performed by 20% of the staff. The problem is you don’t always have a clear idea of which 20%.
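
If you do happen to have bug counts per module, finding your 20% is a quick exercise. A minimal Python sketch, with module names and counts made up for illustration:

    # Find the smallest set of modules that accounts for 80% of the bugs.
    bug_counts = {
        "parser": 400, "ui": 100, "auth": 40, "cache": 12,
        "logging": 8, "config": 5, "docs": 3, "utils": 2,
    }

    total = sum(bug_counts.values())
    running, hot_spots = 0, []
    for module, bugs in sorted(bug_counts.items(), key=lambda kv: -kv[1]):
        hot_spots.append(module)
        running += bugs
        if running / total >= 0.8:
            break

    print(hot_spots)  # the small fraction of modules hiding most of the bugs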

Sturgeon’s Revelation

The revelation has nothing to do with seafood, as one might be mistaken to believe. Rather, it states that…

Ninety percent of everything is crud.

Sounds like Sturgeon is a conversation killer at parties. Is this a revelation because that number is so small?

The Peter Principle

One of the most depressing laws in this list, if you happen to have first-hand experience with it through working with incompetent managers.

In a hierarchy, every employee tends to rise to his level of incompetence.

Just read Dilbert (or watch The Office) to get some examples of this in action.

Hofstadter’s Law

This one is great because it is so true. I knew this law, and still this post took longer than I expected.

A task always takes longer than you expect, even when you take into account Hofstadter’s Law.

By the way, you get extra bonus points among your Mensa friends for invoking a self-referential law like this one.

Murphy’s Law

The one we all know and love.

If anything can go wrong, it will.

Speaking of which, wait one second while I backup my computer.

The developer’s response to this law should be defensive programming and the age-old Boy Scout motto: Be Prepared.

Brooks’ Law

Adding manpower to a late software project makes it later.

Named after Fred Brooks, aka Mr. Mythical Man-Month. My favorite corollary to this law is the following…

The bearing of a child takes nine months, no matter how many women are assigned.

Obviously, Brooks was not a statistician.

Conway’s Law

Having nothing to do with country music, this law states…

Any piece of software reflects the organizational structure that produced it.

Put another way…

If you have four groups working on a compiler, you’ll get a 4-pass compiler.

How many groups are involved in the software you are building?

Kerckhoffs’s Principle

This principle is named after a man who must be the only cryptographer ever to have four consecutive consonants in his last name.

In cryptography, a system should be secure even if everything about the system, except for a small piece of information — the key — is public knowledge.

And thus Kerckhoffs raises the banner in the fight against Security through Obscurity. This is the main principle underlying public-key cryptography.
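
For a toy illustration of the principle in Python, consider a one-time pad: the algorithm below is completely public, and the security rests entirely on keeping the key secret. (A sketch for illustration only, not production cryptography.)

    import secrets

    message = b"attack at dawn"
    key = secrets.token_bytes(len(message))  # the key is the only secret

    # The XOR algorithm itself is public knowledge, per Kerckhoffs.
    ciphertext = bytes(m ^ k for m, k in zip(message, key))
    recovered = bytes(c ^ k for c, k in zip(ciphertext, key))

    assert recovered == message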

Linus’s Law

Named after Linus Torvalds, the creator of Linux, this law states…

Given enough eyeballs, all bugs are shallow.

Where you store the eyeballs is up to you.

Reed’s Law

The utility of large networks, particularly social networks, scales exponentially with the size of the network.

Keep repeating that to yourself as you continue to invite anyone and everyone to be your friend on Facebook.

Metcalfe’s Law

In network theory, the value of a system grows as approximately the square of the number of users of the system.

I wonder if Reed and Metcalfe hung out at the same pubs.
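
Possibly, and perhaps they compared growth curves over a pint. Here is a back-of-the-envelope Python sketch of how differently the two laws scale; the constants are hand-waved, and only the growth rates matter:

    for n in [10, 20, 30, 40]:
        metcalfe = n * (n - 1) // 2  # distinct pairs of users (~n^2)
        reed = 2 ** n - n - 1        # possible subgroups of two or more (~2^n)
        print(f"n={n:>2}  pairs={metcalfe:,}  subgroups={reed:,}")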

Moore’s Law

Probably the most famous law in computing, this law states…

The power of computers per unit cost doubles every 24 months.

The more popular and well known version of Moore’s law states…

The number of transistors on an integrated circuit will double in about 18 months.

And we’ve been racing to keep up ever since.
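
Treated as compound interest, the 18-month formulation is easy to play with. A hand-wavy Python sketch:

    # transistors(t) = transistors(0) * 2^(months / doubling_period)
    def projected_transistors(initial, months, doubling_period=18):
        return initial * 2 ** (months / doubling_period)

    # Projecting the 2,300-transistor Intel 4004 (1971) thirty years out:
    print(f"{projected_transistors(2300, 30 * 12):,.0f}")  # ~2.4 billion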

Rock’s Law

I was unable to find Paper’s Corollary, nor Scissor’s Lemma, so we’re left with only Rock’s law which states…

The cost of a semiconductor chip fabrication plant doubles every four years.

Buy yours now while prices are still low.

Wirth’s Law

Software gets slower faster than hardware gets faster.

Ha! Take that Moore’s Law!

Zawinski’s Law

This law addresses software bloat and states…

Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.

I hear that the next version of calc.exe is going to include the ability to read email. A more modern formulation of this law should replace email with RSS.

Fitts’s Law

This is a law related to usability which states…

Time = a + b log₂(D / S + 1)

Or in plain English,

The time to acquire a target is a function of the distance to and the size of the target.

A well-known application of this law is placing the Start menu in the bottom-left corner, thus making the target very large, since the corner is constrained by the left and bottom edges of the screen.
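
For the curious, here is a minimal Python sketch of the formula. The constants a and b are device- and user-specific; the values below are made up for illustration.

    import math

    def fitts_time(D, S, a=0.1, b=0.15):
        # Predicted time to acquire a target of size S at distance D.
        return a + b * math.log2(D / S + 1)

    print(fitts_time(D=800, S=20))   # small, faraway target: slower
    print(fitts_time(D=800, S=200))  # larger target at the same distance: faster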

Hick’s Law

Has nothing to do with people with bad mullets. I swear. Related to Fitts’s Law, it states that…

The time it takes to make a decision is a function of the number of possible choices.

Or in plain math,

Time = b log₂(n + 1)

Seems to me this is also a function of the number of people making the decision, like when you and your coworkers are trying to figure out where to have lunch.
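
And a matching Python sketch for Hick’s Law, again with a made-up constant b and the assumption that all n choices are equally likely:

    import math

    def hick_time(n, b=0.2):
        # Predicted decision time among n equally likely choices.
        return b * math.log2(n + 1)

    for n in [2, 8, 32]:
        print(f"{n} choices -> {hick_time(n):.2f}s")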

 

Size Comparison of the Universe

From the Planck scale to the cosmic scale, the Size Comparison of the Universe will show you just how large our universe is!
This is an update to my previous size comparison video, published in 2017. There are many improvements this year: the first true 4K video, the use of cameras in the video, masking, improved color depth and much more!

Song source(s), in order:

1. Mattia Cupelli: Epic Powerful Emotional Music | The Phoenix (Download and Royalty FREE) – https://www.youtube.com/watch?v=b6MS-…
2. Ross Bugden – Music: Emotional and Dramatic Music – Reverie (Copyright and Royalty Free) – https://www.youtube.com/watch?v=CZ72Z…
3. Savfk – Music: Odyssey by Savfk (copyright and royalty free dramatic, orchestral, intense, epic soundtrack music) – https://www.youtube.com/watch?v=XF8mF…
4. Lionel Schmitt: Epic Music – 1000 Ships Against The Wind (Big Orchestral Fantasy With Choir) – https://soundcloud.com/lionel-schmitt…

Image, song, video and thumbnail source(s): https://drive.google.com/file/d/1e3YR…

A Brief Totally Accurate History Of Programming Languages

Every programmer has strong opinions on which programming languages are best. In fact, these opinions are so deeply felt that they could easily be misinterpreted as religious in nature. But what is the history and evolution of these ideas and events? What follows is a list of key milestones and noteworthy people in the programming universe.

1800

Joseph Marie Jacquard teaches a loom to read punch cards, creating the first heavily multi-threaded processing unit. His invention was fiercely opposed by the silk-weavers who foresaw the birth of Skynet.

1842

Ada Lovelace gets bored of being noble and scribbles in a notebook what will later be known as the first published computer program, only slightly inconvenienced by the fact that there were no computers around at the time.

1936

Alan Turing invents everything, the British courts do not approve and have him chemically castrated.

The Queen later pardoned him, but unfortunately he had already been dead for centuries at that time.

1936

Alonzo Church also invents everything with Turing, but from across the pond and was not castrated by the Queen.

1957

John Backus creates FORTRAN which is the first language that real programmers use.

1959

Grace Hopper invents the first enterprise ready business oriented programming language and calls it the “common business-oriented language” or COBOL for short.

1964

John Kemeny and Thomas Kurtz decide programming is too hard and they need to go back to basics, they call their programming language BASIC.

1970

Niklaus Wirth makes Pascal become a thing along with a number of other languages, he likes making languages.

He also invents Wirth’s Law, which makes Moore’s Law obsolete because software developers will write such bloated software that even mainframes cannot keep up. This will later be proven true with the invention of Electron.js.

1972

Dennis Ritchie gets bored during work hours at Bell Labs and decides to make C, which has curly braces and therefore ends up being a huge success. Afterwards he adds segmentation faults and other developer-friendly features to aid productivity.

Still having a couple of hours remaining, he and his buddies at Bell Labs decide to write an example program demonstrating C; they make an operating system called Unix.

1980

Alan Kay invents object oriented programming and calls it Smalltalk, in Smalltalk everything is an object, even an object is an object. No one really has time to understand the meaning of small talk.

1983

Jean Ichbiah notices that Ada Lovelace’s programs never actually ran and decides to create a language bearing her name, but the language continues to not run.

1986

Brad Cox and Tom Love decide to make an unreadable version of C based on Smalltalk, which they call Objective-C, but no one is able to understand the syntax.

1983

Bjarne Stroustrup travels back to the future and notices that C is not taking enough time to compile, so he adds every feature he can think of to the language and names it C++.

Programmers everywhere adopt it so they have genuine excuses to watch cat videos and read xkcd while working.

1987

Larry Wall has a religious experience, becomes a preacher, and makes Perl the doctrine.

1991

Guido van Rossum does not like curly braces and invents Python; syntax choices were inspired by Monty Python’s Flying Circus.

1993

Roberto Ierusalimschy and friends decide they need a scripting language local to Brazil. During localization, an error is made that makes indices start counting from 1 instead of 0. They name it Lua.

1994

Rasmus Lerdorf makes a template engine for the CGI scripts on his personal homepage, and he releases his dotfiles on the web.

The world decides to use these dotfiles for everything and in a frenzy Rasmus throws some extra database bindings in there for the heck of it and calls it PHP.

1995

Yukihiro Matsumoto is not very happy, he notices other programmers are not happy. He creates Ruby to make programmers happy. After creating Ruby “Matz” is happy, the Ruby community is happy, everyone is happy.

1995

Brendan Eich takes the weekend off to design a language that will be used to power every single web browser in the world, and eventually Skynet. He originally went to Netscape and said it was called LiveScript, but Java became popular during the code review, so they decided they had better use curly braces and rename it JavaScript.

Java turned out to be a trademark that would get them in trouble, so JavaScript later gets renamed to ECMAScript, and everyone still calls it JavaScript.

1996

James Gosling invents Java, the first truly overly verbose object oriented programming language where design patterns rule supreme over pragmatism.

It’s super effective: the manager provider container provider service manager singleton manager provider pattern is born.

2001

Anders Hejlsberg re-invents Java and calls it C# because programming in C feels cooler than Java. Everyone loves this new version of Java for totally not being like Java.

2005

David Hanselmeyer Hansen creates a web framework called Ruby on Rails, people no longer remember that the two are separate things.

2006

John Resig writes a helper library for JavaScript; everyone thinks it’s a language and makes a career of copying and pasting jQuery codes from the internets.

2009

Ken Thompson and Rob Pike decide to make a language like C, but with more safety equipment, more marketability, and gophers as mascots.

They call it Go, make it open source and sell Gopher branded kneepads and helmets separately.

2010

Graydon Hoare also wants to make a language like C, he calls it Rust. Everyone demands that every single piece of software be rewritten in Rust immediately. Graydon wants shinier things and starts working on Swift for Apple.

2012

Anders Hejlsberg wants to write C# in web browsers, so he designs TypeScript, which is JavaScript but with more Java in it.

2013

Jeremy Ashkenas wants to be happy like Ruby developers, so he creates CoffeeScript, which compiles to JavaScript but looks more like Ruby. Jeremy never becomes truly happy like Matz and the Ruby developers.

2014

Chris Lattner makes Swift with the primary design goal of not being Objective-C; in the end, it looks like Java.


 

Original article here.

Visual introduction to the Fourier Transform

One of my more difficult electrical engineering courses in college was on signals and communications. I remember there being an emphasis on Fourier Transforms. I also remember being lost a lot. I wish the demos had been more visual, like this video.

This is an animated introduction to the Fourier Transform, winding graphs around circles, created by the YouTube channel 3blue1brown.

If you are new to their channel and want to see more, a good place to start is this playlist.
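
If you want to play with the video’s central idea (winding a signal around a circle and tracking its center of mass), here is a minimal numpy sketch; the signal and test frequencies are chosen arbitrarily:

    import numpy as np

    t = np.linspace(0, 4, 2000)         # 4 seconds of samples
    signal = np.cos(2 * np.pi * 3 * t)  # a pure 3 Hz cosine

    def center_of_mass(f):
        # Wind the signal around the origin at f cycles per second.
        wound = signal * np.exp(-2j * np.pi * f * t)
        return wound.mean()

    for f in [1.0, 2.0, 3.0, 4.0]:
        print(f"{f} Hz -> |center of mass| = {abs(center_of_mass(f)):.3f}")
    # The magnitude spikes near f = 3 Hz, the signal's true frequency;
    # tracking that spike over all f is essentially the Fourier transform.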

 

Thomas Aquinas Against Your Digital-Age Anxiety

A recent study found that “more Americans [are] suffering from stress, anxiety, and depression” than ever recorded. An article in Psychology Today described this increased anxiety as contributing to the fact that “between 1999 and 2014, the suicide rate increased [by] 24 percent.” Nitsuh Abebe attempted to diagnose “America’s New ‘Anxiety’ Disorder” in a recent article for the New York Times Magazine. The usual suspects have been rounded up: We are anxious thanks to “social media,” thanks to “Trump,” thanks to the lingering effects of the “Great Recession,” and so on.

But before we gulp down another round of meds and launch into hazy rants lambasting Trump for our digital-age jitters, we ought to be clear on what, exactly, anxiety is. This is the vocation of philosophy in the modern age — to plod behind popular discourse and puzzle over the terms it casts away like so many cigarette ends on a beach.

Anxiety is a kind of fear. The medieval theologian Thomas Aquinas describes fear in his great textbook, the Summa Theologiae: Fear is a “passion of the soul” characterized by a bodily “contraction” that occurs whenever one “regards a future evil which surpasses the power of him that fears, so that it is irresistible.” Let’s break Aquinas’ definition into its parts.

1. Fear is a passion of the soul.

Fear is an emotion, a feeling; it belongs to that class of experiences that arise in response to our perception of this or that reality as significant, allowing it to pierce and trouble our psychological and bodily experience.

2. Characterized by a contraction.

Fear is characterized by the bodily experience of “shrinking inwards.” One’s extremities grow cold, we tremble, pull back from danger, and so forth. Aquinas describes this in fantastically medieval fashion: “[W]e see the inhabitants of a city, when seized with fear, leave the outskirts, and, as far as possible, make for the inner quarters. It is in resemblance to this contraction…that in fear a similar contraction of heat and vital spirits towards the inner parts takes place in regard to the body.”

3. Fear regards a future evil.

The object of fear (or the specific significance of the reality that causes fear) is always an evil — the destruction of some good we hold dear. Aquinas takes pains to indicate that the object of fear is always imaginary. We do not fear the lack of our leg after a bear has bitten it off. We sorrow over it. But if we imagine that this or that bear will bite off our favorite leg, then we have fear — fear of the future loss of our leg. Fear is a future-facing emotion, a shrinking away from the imagined destruction of good things.

4. Fear regards a future evil as irresistible.

Resistible evils are not met with fear. If I know I can avoid failing an exam, I do not fear it, even though I may recognize it as a possible future evil. The sneaking suspicion that I am unprepared, unable to resist, doomed and damned to be squished under the oncoming Finals’ Week Express — this gives rise to fear.

Fear comes in many kinds. Laziness is a kind of fear — a shrinking away from the future evil of labor. This fear became fashion a few decades ago, when our antiheroes were characterized by laconic lethargy, a slow-moving “cool” that avoided the work of the “stiffs” and the “suits” — like Cool Hand Luke. Today our fictional heroes are more likely to be anxious — manic loners and twitching nerds, like the BBC’s Sherlock. Shame is another kind of fear — a felt contraction that occurs upon the threat of disgrace. This fear was respectable in Aquinas’ age — he called shame a quasi-virtue, held by those with a great sense of honor. Anxiety is our fear of choice, a mark of success in the digital age. As a New York article put it: “Panicked strivers have replaced sullen slackers as the caricatures of the moment, and Xanax has eclipsed Prozac as the emblem of the national mood.”

The editors of Wired recently took up the problem of our increased anxiety, rooted it in our transition to a digital age, and attempted to assuage us, arguing, “It’s natural to worry. But it isn’t helpful.” They set about restoring our peace by offering sane evaluations of the threats posed by cyberattacks, mechanized labor, identity theft, and so on. But anxiety, as Aquinas argues, is fear of the “unforeseen evil” — the irresistible possibility of disaster lurking on the horizon.

Unlike the straightforward fear of cyberattacks or charging bears, anxiety is usually puzzled by its own existence; the object of fear remains unforeseen. The anxious man is ill at ease; sick with the feeling that something — he knows not what — is about to go wrong. We should phrase the question in accordance with Aquinas’ description: What about the digital age makes man into an antenna nervously detecting obscure future evils on the horizon?

The digital age is one in which useful skills and property are gathered into the hands of the few and rented out to the many. We make our monthly payments with a potent combination of currencies: money, attention span, personal information, and an increasing inability to do without these selfsame digital conveniences.

No one likes to confess to a new serfdom, but consider: We exchange the owned skill of map reading for the rented convenience of GPS, paid for by ad space and suggestive selling. We exchange the owned good of sexual arousal for the rented convenience of internet porn. We exchange the owned skills of book reading, memory, writing, entertainment, and communication for apps, devices, and programs that we rent from a shrinking pantheon of monopolies — Facebook, Google, Amazon, Verizon, and Apple.

Increased rent always increases the possibility of unforeseen evils, because rent removes problems and solutions from our hands and places them in the hands of others. Most people do not own the skill of automobile repair — we rent it from someone who does. This eliminates a myriad of foreseen evils — having to do the work, buy the tools, build the garage, and so forth. It also increases the threat of unforeseen evils — like breaking down in the middle of a journey across Texas without a clue as to what’s under the front hood. For most car owners, this is an acceptable trade.

But the trade of foreseen evils for unforeseen evils has become our daily bread. We save our work and memories in the cloud, avoiding the foreseen inconveniences of physicality by taking on an added threat of disaster in the form of our “lives” being accidentally deleted. My credit card gets rid of the inconvenience of carrying cash by the technology of the chip — by increasing the threat of unforeseen disaster in the form of credit card hacking and technological failure. We trade the foreseen evils of maps, cameras, and a working knowledge of home repair for the sleek convenience of the smartphone — increasing the threat of unforeseen evil in the form of a malfunction that leaves us without navigation, communication, or vital information.

Whenever we begin to rent a previously owned product, service, or skill, we rid ourselves of the foreseen inconveniences of ownership and take on the statistically less probable threat that the corporations we are renting our products, services, and skills from will fail us — and we’ll be left in the dark. Our devices decrease a multitude of daily trials, toils, and labors in exchange for a quantitative increase in possible, unforeseen evils — that is, the failure of our devices, which we are unable to fix, or the future decisions of their owners, which we are unable to predict.

We are becoming a generation of human beings who tackle fewer foreseen evils and fret over more unforeseen, disastrous possibilities than any previous generation.

The new man is an anxious man because the new man is a renter.

The typical explanations of modern anxiety are unified by the explanation of rent. Yes, we are anxious because of social media, but not simply because it presents us with horrible news stories and constant comparison. To use social media is to exchange the owned skill of communication for a rented product, relying on a myriad of developers, designers, and, increasingly, their memes and emoticons in order to hear and be heard. We exchange a whole host of foreseen evils associated with letter writing and face-to-face communication for a host of unforeseen evils, like hacking and breakdown — future evils that we hope Mark Zuckerberg can resist, because to us, they are as irresistible as they are unknown.

Yes, we are anxious, thanks to “economic insecurity,” but not because we are poor. We are poor with rented apartments, smartphones, Facebook pages, and Netflix accounts; poor in an age that “gets by” by paying more and more rent for previously owned skills and goods; poor in an age in which poverty rarely attains the dignity of thrift and self-reliance, but must scrounge up the cash for an internet connection and more cellphone minutes. This poverty is characterized by anxiety because it is characterized by an extreme dependence on earthly powers.

In “America’s New ‘Anxiety’ Disorder,” Abebe argues that we are anxious because our current political scene has revealed that “the life of the nation may not take place in a realm of issues and policy and consensus building, but someplace more disordered, irrational, and human.” But this is hardly a new insight for the American political participant. The difference is that, in an age of rent, we are more dependent on fewer and fewer decision-makers to make the right choices. When our daily life depends on the provision of large-scale services like electricity, data, and a connection to the cloud; when a business in Ohio needs the wealthy of Silicon Valley to remain prudent, wise, and successful in order to exist; when wealth is concentrated into the hands of a few owners on whom we all depend for our well-being, then our evils become unforeseen evils and we become anxious.

This is why all tech optimism, insofar as it promotes an exchange of ownership for rent, is wrong. There will be no happy future. There will be a worried, anxious population nervously checking their phones, suppressing a fear of looming disaster, their lives held in the hands of corporate monopolies that will, eventually, screw up. The current obsession with apocalyptic thinking in art, politics, and even business is no accident — it is simply what a certain accumulated threat level of disaster feels like to the human psyche.

Tech pessimism is often critiqued for being romantic — for inventing an idyllic past before the supposedly “dehumanizing” influence of smart technology. But a modest call to throw away our smartphones and pick up dumber technologies doesn’t promise an end to foreseen evils. If anything, it promises a grand increase of foreseen evils, sparked by an increase of problems and solutions placed firmly in our hands — like maps, compasses, woodstoves, and pencils that need sharpening. A future without fear is an idle dream, but we can choose the kind of fear we will suffer. The wager of tech pessimism is that it is a far finer thing to fear the evils presented to us by goods, skills, and ventures that really are our own — finer than to fear the unforeseeable actions of the wealthy from whom we rent.

Original article here.

 

The History Of Intel CPUs (slideshow)

Intel Begins with The 4004


The first microprocessor sold by Intel was the four-bit 4004, released in 1971. It was designed to work in conjunction with three other microchips: the 4001 ROM, the 4002 RAM, and the 4003 Shift Register. Whereas the 4004 itself performed calculations, those other components were critical to making the processor function. The 4004 was mostly used inside calculators and similar devices, and it was not meant for use in computers. Its maximum clock speed was 740 kHz.

The 4004 was followed by a similar processor known as the 4040, which was essentially an improved variation of the 4004 with an extended instruction set and higher performance.

See the entire history of Intel CPUs here.

 

Divorce and Occupation (interactive plot)

Some jobs tend towards higher divorce rates. Some towards lower.

As people are marrying later and staying single longer, divorce continues to be common in the United States. It’s not the mythical “half of marriages end in divorce” common, but the percentages are up there.

Divorce rates vary a lot by group though. Rates are higher for the unemployed than employed. Divorce among Asians tends to be much lower than other races. Rates change a lot by education level.

So, let’s look at divorce rates by occupation. Using data from the 2015 American Community Survey, for each occupation, I calculated the percentage of people who divorced out of those who married at least once.
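
For a sense of the arithmetic, here is a toy pandas sketch of that calculation; the rows and column names are stand-ins, not the real ACS fields.

    import pandas as pd

    people = pd.DataFrame({
        "occupation":    ["actuary", "actuary", "bartender", "bartender", "bartender"],
        "times_married": [1, 0, 2, 1, 1],
        "divorced":      [0, 0, 1, 1, 0],
    })

    # Of those who married at least once, what share divorced, per occupation?
    ever_married = people[people["times_married"] >= 1]
    print(ever_married.groupby("occupation")["divorced"].mean())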

Go here to see the results. Each dot represents an occupation. Mouse over for details.

 

Original article here.

World’s largest instrument resides in a cave (video)

The world’s largest instrument exists deep inside the Luray Caverns in the Appalachian Mountains, and it’s 400 million years in the making, Scientific American reports. It’s called the Great Stalacpipe Organ, but it’s unlike a typical organ, which forces air through pipes to create music — this instrument rhythmically strikes the cave’s stalactites to create beautiful sounds.

How it works: Large rubber mallets sit next to the 37 stalactites. When the organist strikes a key on the instrument, the corresponding mallet strikes its stalactite, all coordinated via a hidden mechanical device that receives electrical signals from the organ. The organ’s inventor sanded down 35 of the 37 stalactites to perfect their tone (two didn’t need it), and the 3.5-acre cave lends natural acoustics to the songs.

Because the acoustics are not uniform throughout the cavern, it can be difficult to play by ear. So an automated system — using a plastic sheet with holes in it that rotates around a metal drum — plays songs, much like a player piano. When metal meets the drum through a hole in the sheet, the corresponding stalactites sound off.

Go deeper: A combination of Mother Nature’s work, time, and a skilled mathematician helped create the Great Stalacpipe Organ. The large, echoey chambers of Luray Caverns have formed naturally over the past 400 million years. Calcium-rich water droplets eventually formed the large stalactites that hang from the cave’s ceiling and now act as an integral part of this instrument. Back in 1954, mathematician and electronics engineer Leland Sprinkle visited the caverns. As was customary during these tours, the guide would strike the stalactites to show visitors how each one gave off a unique sound. That moment sparked an idea in Sprinkle’s head — create the Great Stalacpipe Organ, which he did in three years.

Original article here.