10 interesting stories served every morning and every evening.




1. 1,045 shares, 45 trendiness

Everything you missed over the last 10 years

JavaScript has come a long way since I knew it as the "D" in DHTML. For anyone like me, who's been reluctant to use the latest syntax that could require polyfills or a transpiler, I've written this cheatsheet to get you caught up on all the goodness that's widely supported in modern browsers.

I've made this page concise, with runnable examples and links to further documentation. If you have any questions or spot any errata, please contact me.

Check out all these new built-in array functions! No more need for underscore or lodash!
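
For instance, a few of the newer built-ins (the sample data here is made up for illustration):

const nums = [1, 2, 3, 4, 5]
console.log(nums.find(n => n > 3))         // 4
console.log(nums.includes(3))              // true
console.log(nums.flatMap(n => [n, n * 2])) // [1, 2, 2, 4, 3, 6]
console.log([[1, 2], [3, 4]].flat())       // [1, 2, 3, 4]
console.log(Array.from("abc"))             // ["a", "b", "c"]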

The new let and const keywords declare variables in block scope (as opposed to global or function scope). Using const means the binding cannot be reassigned (the reference is immutable); it does not freeze the value itself. Use let if the variable will be reassigned.
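
A quick sketch of block scoping (the names are illustrative):

{
  let count = 1
  const name = "Ada"
  count = 2        // fine: let allows reassignment
  // name = "Bob"  // TypeError: assignment to constant variable
}
// console.log(count) // ReferenceError: count is only visible inside the block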

The ?? (nullish coalescing) operator falls back to its right-hand side only when the value is null or undefined. No more need to use the !! check.
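
A small example of how it differs from a truthiness-based fallback (the settings object is made up):

const settings = { retries: 0, timeout: null }
console.log(settings.retries || 5)   // 5  (|| treats 0 as falsy)
console.log(settings.retries ?? 5)   // 0  (?? only falls back on null/undefined)
console.log(settings.timeout ?? 300) // 300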

The ?. (optional chaining) operator checks that the value is not null or undefined before accessing the next property or calling the next function; otherwise it short-circuits to undefined instead of throwing. Extremely useful when dealing with optional props.
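
For example (user is an arbitrary object here):

const user = { profile: { name: "Ada" } }
console.log(user.profile?.name) // "Ada"
console.log(user.address?.city) // undefined instead of a TypeError
console.log(user.sayHi?.())     // undefined; the call is skipped entirely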

The async/await keywords are here to save you from callback hell. Use await to make an asynchronous call resemble a synchronous call, i.e. running await fetchUserName() will not proceed to the next line until fetchUserName() is complete. Note, in order to use await, you have to be executing a function declared as async, i.e.

async function fn() { await fetchUserName() }.
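
A slightly fuller sketch, with fetchUserName stubbed out as a stand-in for any promise-returning call:

function fetchUserName() {
  return new Promise(resolve => setTimeout(() => resolve("Ada"), 100))
}

async function greet() {
  const name = await fetchUserName() // pauses here until the promise resolves
  console.log("Hello, " + name)
}

greet()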

Arrow functions are bound to the current context: they don't get their own this, they use the this of the enclosing scope. There are three main forms you'll see in the wild:

single argument, single line, multi-line.

The single argument form does not require parentheses, and the single line form does not require a return statement; the return is implicit.

The multi-line form requires a return statement if the function intends to return something. Multiple arguments require parentheses.
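
The three forms side by side (function names and bodies are just examples):

const square = x => x * x          // single argument: no parentheses, implicit return
const add = (a, b) => a + b        // multiple arguments need parentheses
const describe = (name, age) => {  // multi-line body needs an explicit return
  const label = name + " (" + age + ")"
  return label
}
console.log(square(4), add(1, 2), describe("Ada", 36))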

Used for looping over an iterable. Similar to for…in except you don't have to check for hasOwnProperty. You cannot use this looping syntax on an Object directly because a plain Object isn't iterable. Instead use Object.entries(obj) to retrieve an iterable.
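
For example:

for (const n of [10, 20, 30]) {
  console.log(n)
}

for (const [key, value] of Object.entries({ a: 1, b: 2 })) {
  console.log(key + " = " + value)
}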

Asynchronous iteration was introduced in 2018. Much like Promise.all, it can be used to synchronize many asynchronous tasks. The example below shows 3 tasks happening asynchronously. The loop processes one result at a time, in order; in this case, the quickest tasks to complete are only evident at the end of the iteration.

for await…of docs
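
A sketch along the lines the article describes, with made-up delays so the quickest task finishes first but is still printed last:

const delay = (ms, label) =>
  new Promise(resolve => setTimeout(() => resolve(label), ms))

async function run() {
  const tasks = [delay(300, "slow"), delay(200, "medium"), delay(100, "quick")]
  for await (const result of tasks) {
    console.log(result) // logged in task order: slow, medium, quick
  }
}
run()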

In 2015, ES6 brought classes to JavaScript 🎉. JavaScript classes are similar to the classes you know and love from other languages: inheritance, class methods, getters and setters, properties, etc.
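
A minimal class hierarchy (the names are illustrative):

class Person {
  constructor(name) {
    this.name = name
  }
  greet() {
    return "Hi, I'm " + this.name
  }
}

class Employee extends Person {
  constructor(name, title) {
    super(name)
    this.title = title
  }
}

console.log(new Employee("Ada", "Engineer").greet()) // "Hi, I'm Ada"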

Get and set are functions that are called like properties, i.e. person.age = 16; person.age > 18. These are very convenient when you need a dynamic or computed property. And they can be used with both classes and regular objects.
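
For instance, a computed age on a plain object (the birth year is arbitrary):

const person = {
  birthYear: 2005,
  get age() {
    return new Date().getFullYear() - this.birthYear
  },
  set age(value) {
    this.birthYear = new Date().getFullYear() - value
  }
}

person.age = 16              // runs the setter
console.log(person.age > 18) // runs the getter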

Yay! You can now specify default parameters in your function definition. Works as you would expect.
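
For example:

function greet(name = "stranger") {
  return "Hello, " + name
}
console.log(greet())      // "Hello, stranger"
console.log(greet("Ada")) // "Hello, Ada"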

With a bit of object destructuring magic, functions can now have named parameters.
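
A small sketch (createUser and its fields are made up):

function createUser({ name, isAdmin = false }) {
  console.log(name, isAdmin)
}
createUser({ name: "Ada", isAdmin: true })
createUser({ name: "Grace" }) // isAdmin falls back to false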

The rest parameter allows a function to accept an arbitrary number of arguments as an array. It's recommended to use this over arguments.
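
For example:

function sum(...nums) {      // nums is a real array, unlike arguments
  return nums.reduce((total, n) => total + n, 0)
}
console.log(sum(1, 2, 3, 4)) // 10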

Object.assign(target, source) merges two or more objects into one. It modifies the target object in-place, so if you'd prefer a new object be created, pass an empty object literal as the first argument.

Alternatively, you can use the spread operator ... to merge multiple objects together: {...obj1, ...obj2}, though bear in mind, spread will not call setters on the object, so to be the most portable, consider Object.assign. The spread operator can also be used on arrays as shown in the last code sample.
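
A quick illustration of both approaches (the objects are arbitrary):

const defaults = { theme: "light", fontSize: 12 }
const prefs = { theme: "dark" }

const merged = Object.assign({}, defaults, prefs) // new object; sources untouched
const spread = { ...defaults, ...prefs }          // same result via spread
const combined = [...[1, 2], ...[3, 4]]           // spread works on arrays too

console.log(merged, spread, combined)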

Destructuring allows you to extract values from objects and arrays through patterns. It is a complex topic with many applications…far too many for me to enumerate, but I've shown some of the most common uses I can think of.

Destructuring docs and MDN docs

function f() {
  return [1, 2];
}

let [a, b] = f()
print("a=" + a + " b=" + b)

const obj = {state: {id: 1, is_verified: false}}
const {id, is_verified: verified} = obj.state
print("id = " + id)
print("verified = " + verified)

for (const [key, value] of Object.entries({a: 1, b: 2, c: 3})) {
  print(key + " is " + value);
}

Functions declared on objects can use a new shorthand style that omits the function keyword.

The two functions (fn1, fn2) are equivalent in the sample below.
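
A sketch of that kind of sample, keeping the fn1/fn2 names from the text (the bodies are placeholders):

const obj = {
  fn1: function () { // long-hand style
    return "hello"
  },
  fn2() {            // shorthand: no function keyword
    return "hello"
  }
}
console.log(obj.fn1(), obj.fn2())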

I've mostly skipped over promises because async/await is preferred, but sometimes you need to synchronize multiple asynchronous calls, and Promise.all is the easiest way to do it.
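
For example (the two promises stand in for any asynchronous calls):

const fetchA = Promise.resolve("a")
const fetchB = Promise.resolve("b")

Promise.all([fetchA, fetchB]).then(([a, b]) => {
  console.log(a, b) // runs once every promise has resolved
})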

Also known as template strings, this new syntax provides easy string interpolation and multi-line strings.
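
For example:

const name = "Ada"
const greeting = `Hello ${name},
this spans multiple lines and ${1 + 1} is interpolated.`
console.log(greeting)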

A Proxy allows you to intercept get/set calls on another object. This could be useful for watching a property for changes, then updating the DOM, or making innovative APIs like the www proxy below.

Proxy docs

let _nums = [1, 2, 3]
let nums = new Proxy(_nums, {
  set(target, key, value) {
    target[key] = value
    print("set called with " + key + "=" + value)
    print("update DOM")
    return true
  }
})

nums.push(4)
print("nums: " + nums)
print("_nums: " + _nums)

Modules allow you to namespace your code and break down functionality into smaller files. In the example below, we have a module named greet.js that gets included in index.html. Note, module loading is always deferred, so it won't block the HTML from rendering. There are many ways to import/export functionality from js files, read more in the export docs.
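
A sketch of what the greet.js/index.html pair could look like (the exact contents are assumed, not taken from the article):

// greet.js
export function greet(name) {
  return `Hello ${name}`
}

// index.html would pull it in with something like:
//   <script type="module">
//     import { greet } from './greet.js'
//     console.log(greet('Ada'))
//   </script>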

Okay, so I didn't cover everything that's changed over the last decade, just the items I find most useful. Check out these other topics.

...

Read the original on turriate.com »

2. 539 shares, 0 trendiness

Neural implant lets paralyzed person type by imagining writing

Elon Musk’s Neuralink has been mak­ing waves on the tech­nol­ogy side of neural im­plants, but it has­n’t yet shown how we might ac­tu­ally use im­plants. For now, demon­strat­ing the promise of im­plants re­mains in the hands of the aca­d­e­mic com­mu­nity.

This week, the aca­d­e­mic com­mu­nity pro­vided a rather im­pres­sive ex­am­ple of the promise of neural im­plants. Using an im­plant, a par­a­lyzed in­di­vid­ual man­aged to type out roughly 90 char­ac­ters per minute sim­ply by imag­in­ing that he was writ­ing those char­ac­ters out by hand.

Previous at­tempts at pro­vid­ing typ­ing ca­pa­bil­i­ties to par­a­lyzed peo­ple via im­plants have in­volved giv­ing sub­jects a vir­tual key­board and let­ting them ma­neu­ver a cur­sor with their mind. The process is ef­fec­tive but slow, and it re­quires the user’s full at­ten­tion, as the sub­ject has to track the progress of the cur­sor and de­ter­mine when to per­form the equiv­a­lent of a key press. It also re­quires the user to spend the time to learn how to con­trol the sys­tem.

But there are other pos­si­ble routes to get­ting char­ac­ters out of the brain and onto the page. Somewhere in our writ­ing thought process, we form the in­ten­tion of us­ing a spe­cific char­ac­ter, and us­ing an im­plant to track this in­ten­tion could po­ten­tially work. Unfortunately, the process is not es­pe­cially well un­der­stood.

Downstream of that in­ten­tion, a de­ci­sion is trans­mit­ted to the mo­tor cor­tex, where it’s trans­lated into ac­tions. Again, there’s an in­tent stage, where the mo­tor cor­tex de­ter­mines it will form the let­ter (by typ­ing or writ­ing, for ex­am­ple), which is then trans­lated into the spe­cific mus­cle mo­tions re­quired to per­form the ac­tion. These processes are much bet­ter un­der­stood, and they’re what the re­search team tar­geted for their new work.

Specifically, the re­searchers placed two im­plants in the pre­mo­tor cor­tex of a par­a­lyzed per­son. This area is thought to be in­volved in form­ing the in­ten­tions to per­form move­ments. Catching these in­ten­tions is much more likely to pro­duce a clear sig­nal than catch­ing the move­ments them­selves, which are likely to be com­plex (any move­ment in­volves mul­ti­ple mus­cles) and de­pend on con­text (where your hand is rel­a­tive to the page you’re writ­ing on, etc.).

With the im­plants in the right place, the re­searchers asked the par­tic­i­pant to imag­ine writ­ing let­ters on a page and recorded the neural ac­tiv­ity as he did so.

Altogether, there were roughly 200 elec­trodes in the par­tic­i­pan­t’s pre­mo­tor cor­tex. Not all of them were in­for­ma­tive for let­ter-writ­ing. But for those that were, the au­thors per­formed a prin­ci­pal com­po­nent analy­sis, which iden­ti­fied the fea­tures of the neural record­ings that dif­fered the most when var­i­ous let­ters were imag­ined. Converting these record­ings into a two-di­men­sional plot, it was ob­vi­ous that the ac­tiv­ity seen when writ­ing a sin­gle char­ac­ter al­ways clus­tered to­gether. And phys­i­cally sim­i­lar char­ac­ters—p and b, for ex­am­ple, or h, n, and r—formed clus­ters near each other.

Overall, the re­searchers found they could de­ci­pher the ap­pro­pri­ate char­ac­ter with an ac­cu­racy of a bit over 94 per­cent, but the sys­tem re­quired a rel­a­tively slow analy­sis af­ter the neural data was recorded. To get things work­ing in real time, the re­searchers trained a re­cur­rent neural net­work to es­ti­mate the prob­a­bil­ity of a sig­nal cor­re­spond­ing to each let­ter.

Despite work­ing with a rel­a­tively small amount of data (only 242 sen­tences’ worth of char­ac­ters), the sys­tem worked re­mark­ably well. The lag be­tween the thought and a char­ac­ter ap­pear­ing on screen was only about half a sec­ond, and the par­tic­i­pant was able to pro­duce about 90 char­ac­ters per minute, eas­ily top­ping the pre­vi­ous record for im­plant-dri­ven typ­ing, which was about 25 char­ac­ters per minute. The raw er­ror rate was only about 5 per­cent, and ap­ply­ing a sys­tem like a typ­ing au­to­cor­rect could drop the er­ror rate down to only 1 per­cent.

The tests were all done with pre­pared sen­tences. Once the sys­tem was val­i­dated, how­ever, the re­searchers asked the par­tic­i­pant to type out free-form an­swers to ques­tions. Here, the speed went down a bit (to 75 char­ac­ters a minute) and er­rors went up to 2 per­cent af­ter au­to­cor­rec­tion, but the sys­tem still worked.

As the researchers themselves put it, this is "not yet a complete, clinically viable system." To begin with, it has only been used in a single individual, so we have no idea how well it might work for others. The simplified alphabet used here doesn't contain any digits, capital letters, or most forms of punctuation. And the behavior of the implants changes over time, perhaps because of minor shifts relative to the neurons they read or the build-up of scar tissue, so the system had to be recalibrated regularly—at least once per week to maintain a tolerable error rate.

That said, the sys­tem shows a very sig­nif­i­cant speed boost com­pared to pre­vi­ous im­plant-dri­ven sys­tems, and the ac­cu­racy is quite good. The sys­tem also has the po­ten­tial to be sim­i­lar to touch-typ­ing, in that a user does­n’t have to ac­tu­ally vi­su­ally fo­cus on let­ter pro­duc­tion, al­low­ing more nor­mal in­ter­ac­tions with the user’s sur­round­ings. The let­ter is­sue might be solved in part by us­ing an al­ter­nate al­pha­bet de­signed by the re­searchers, in which all the let­ters are de­fined by dis­sim­i­lar pat­terns of strokes. There’s a lot of po­ten­tial here.

The ex­per­i­ments also pro­vide a re­minder of the po­ten­tial of these im­plants more gen­er­ally and why com­pa­nies might start find­ing the tech­nol­ogy worth com­mer­cial­iz­ing.

...

Read the original on arstechnica.com »

3. 361 shares, 49 trendiness

Observing my cellphone switch towers

One of my favorite books is the 2013 "High Performance Browser Networking" by Ilya Grigorik. Besides a wealth of actionable advice, the book is illustrated with captivating real life stories.

46% of Battery Consumption to Transfer 0.2% of Total Bytes

Whenever a Pandora user plays a song, the entire music file is streamed by the application from the network in one shot, which is the correct behavior: burst as much data as you can, then turn off the radio for as long as possible. However, following the music transfer, the application would conduct periodic audience measurements by sending intermittent analytics pings every 60 seconds. The net effect? The analytics beacons accounted for 0.2% of the total transferred bytes and 46% of the total power consumption of the application!

Ilya takes the time to go deep to get his points across. To enlighten readers on the topic of cellphone battery life, he dedicates a whole chapter to detailing the GSM, UMTS, and LTE radio modems. It is fascinating to realize that problems at one level can find their roots several layers below.

By exploring the whole stack, High Performance Browser Networking does more than provide facts. It advocates a philosophy.

Good de­vel­op­ers know how things work.

Great de­vel­op­ers know why things work.

An old idea is new again

Back when I read it, in 2013, I thought it would have been cool to do my own ex­plo­ration and vi­su­al­ize how the ra­dio jumped from one cell to an­other while the phone trav­eled.

The idea was not doable with my 2013 iOS phone since it did not expose the data I needed, but my current Pixel does not have this issue. LocationManager can provide a GPS location (lat, long) every second. Meanwhile, TelephonyManager gives the cellID = (mcc, mnc, lac, cid) the radio is currently camping on.

A cellID database[1] allows looking up the (lat, long) of each cellID. What is left is to draw the itinerary (in red) and, for each second, a cellID-color-coded connection to the cell.

A drive from Sunnyvale to down­town Mountain View.

The result above shows a 7-minute drive covering 2.3 miles (3.7 km) with an LTE capable phone (a.k.a. UE for User Equipment). Along the way, five towers and nine cells (a.k.a. eNB for Evolved NodeB) were encountered.

Combining the map, Google StreetView, and Wikipedia allowed me to understand a lot of things.

- Several cel­lIDs map to the same eNB lat/​long co­or­di­nates. That’s be­cause the an­ten­nas mounted on an eNB don’t have 360° cov­er­age. The an­gle and range of each an­tenna carves the space into pizza slice shaped cells.

- Antennas are po­si­tioned and ori­ented strate­gi­cally. In the map on the right, tow­ers are posted along high­way 85 and an­ten­nas pointed par­al­lel to it. Some an­ten­nas seem to have ex­cep­tion­ally nar­row and long range. Possibly to ac­com­mo­date the high den­sity dur­ing traf­fic jams.

- eNBs have a much higher density than I thought. Googling "cellphone tower range" returned a 45-mile figure. That may be true in rural areas but in a city, population density and eNB density are correlated. That means there were towers every mile in Sunnyvale.

- Sites are not necessarily shared among operators. The accuracy of the CellID database (CellMapper) is so high that I was able to go on Google StreetView and see the actual towers. I expected to see huge monoliths with large arrays of antennas for each operator but most of the time it looked like a single one was there.

- eNB an­ten­nas can be found on many things be­sides masts[2]. Some of the lo­ca­tions in­clude churches[3], elec­tric py­lons[4], and even com­mer­cial build­ings.

- Once you are in the habit of look­ing for them, these once in­vis­i­ble cell tow­ers be­come im­pos­si­ble to ig­nore.

- The UE's LTE radio is able to jump back and forth between cells. Several times within a minute seems to be a common occurrence within a city, to palliate building obstruction.

- Tower pairing (a.k.a. camping) looks deterministic. In the two previous maps, the tower usage looks similar in the shared portion of the trip. The selection happens according to a state-machine configured by each cell via broadcast SIB messages. The state transition happens based on multiple factors such as the previous cell's signal strength threshold or the next cell's signal strength threshold.

- On a "long" (10 miles) driving session I saw that the LAC (Location Area Code) part of the CellID remained the same. According to the LTE specs, cell towers don't have to perform UE hand-overs like in GSM/UMTS. The phone starts camping on the next tower while remaining in RRC_IDLE mode without emitting data. Not only does this save battery, it also means operators don't really know where the phone is as long as it remains in the same LAC. If data must be sent, all towers in the same LAC must ping the phone. It may mean LTE offers greater privacy, although this topic seems to have been debated ever since GSM[5].

- Each tower seems to use three 120° an­ten­nas. It is pretty ob­vi­ous when cir­cling around one.

Traveling around a tower reveals the 120° radius of each cell.

Further down the rab­bit hole

Drawing maps was fun. It made me want to learn more about the field. I found it to be not only deep but also quite broad. Even drawing a minimal table to summarize it required a substantial amount of acronym research.

Starting in 1998 with 2G (GSM), all tech-stacks were stan­dard­ized and doc­u­mented by 3GPP. These specs span over hun­dreds of doc­u­ments. Understanding them seems like a life­time achieve­ment.

There is no open-source LTE stack to learn from, and even if there were, emitting on cellphone bands is highly regulated in order to make sure frequencies are not polluted with buggy modems.

The few books in the field are very expensive. My genuine "window of interest" was fueled by these three.

An in­tro­duc­tion to LTE by Christopher Cox.

Finally, there are apps that allow peeking under the hood to show the modem state and messages. I elected not to use them since not only are they expensive, they also require rooting the phone.

...

Read the original on fabiensanglard.net »

4. 316 shares, 16 trendiness

The True Size of Africa

A few years back there was an exhibition in a London gallery by the Royal Geographic Society, and the curator asked the edge.org group to contribute "unusual maps". Thinking it would be for a few hundred people at most, I put together a little map that I had made previously, back in the mid 80s, as an example of scientific visualization graphics software (which I spent a decade on, actually).

It was a very simple premise that I had seen done a number of times before - never claimed it to be a novel invention - but had a slightly new twist in mind: Africa is so mind-numbingly immense that it exceeds the common assumptions by just about anyone I ever met: it contains the entirety of the USA, all of China, India, as well as Japan and pretty much all of Europe - all combined!

And the idea was to roughly put all of them as puz­zle pieces some­how fit­ting in­side the out­line shape of Africa, which is of course just a sym­bolic im­age - it may as well have been just blobs to tell the story, but it ac­tu­ally worked pretty well with the real pieces, at least enough to get the idea across in a vi­sual and vis­ceral way:

Just for reference, the comparison table also lists the surface of the Moon.

During the mak­ing of it, I sent it to a few friends for some feed­back, but then next thing I knew Stephen Fry had tweeted it and within lit­er­ally days there were tens of thou­sands of re­blogs all over the place, a lit­tle vi­ral meme…

Searching for that ex­act phrase got over half a mil­lion re­sponses in 2010:

On the right here you can see each of the main 5 swal­lowed up in­side the land­mass of Africa, one at a time to make it clearer…

And be­low that you can see an ex­act list of an­other set of coun­tries by area, adding up to less than Africa as well…

The whole point be­ing made was that we all have been taught ge­og­ra­phy mainly based on the Mercator pro­jec­tion - as the back­ground in daily tele­vi­sion news, the cover of my school at­las, in gen­eral the ubiq­ui­tous de­pic­tion of the planet.

But the ba­sic fact is that a three-di­men­sional sphere be­ing shown as a sin­gle two-di­men­sional flat im­age will al­ways be sub­ject to a con­ver­sion loss: some­thing has to give…

The rea­son why Mercator was such an im­por­tant ad­vance is sim­ple: on it one can draw straight lines to ac­count for travel routes - in the days of the gi­gan­tic mer­chant fleets and naval bat­tles an im­mensely valu­able at­tribute.

But that abil­ity to use lines in­stead of curves came at a cost: ar­eas near the poles would be greatly ex­ag­ger­ated. Greenland looks de­ceiv­ingly as if it were the size of all of South America for in­stance…

In other words: if things are nor­mal near the equa­tor, every­thing fur­ther north and south is fa­mil­iar to us in a stretched and en­larged ver­sion, veer­ing fur­ther and fur­ther away from the proper size. And con­versely: if we kept the shapes as we in­tu­itively know them now, Africa ought to be stretched mas­sively larger to keep it in true pro­por­tion.

Hence the fact that in everyday thinking, Africa is just about always hugely underestimated - even by college grads, off by a factor of 2 or 3.

The table at right shows the US in­clud­ing Alaska and Hawaii, btw. And while not list­ing Eastern Europe (dark blue in the map) it adds many oth­ers un­used in the map: Mexico, Peru, New Guinea, New Zealand, Nepal and Bangladesh!

All of that be­came the fod­der for the old cliché:

"Are you coming to bed, honey?"

"No, not yet - there is someone wrong on the internet!"

A veritable shitstorm of responses latched onto the tiniest of tiny details. People complained "you missed Ibiza", "how could you make Belgium the same color as the Netherlands" and on and on and on…

Some others were dead certain to have spotted the big flaw: "the UK is not the same size as Madagascar - this dude is SOOO wrong!"… and well DUH… that is PRECISELY the point I was making, in action: Madagascar is much larger!

Had I stretched things to truly prop­erly show that though, then the whole shape of Africa would have looked very awk­wardly elon­gated. I chose to tell the ba­sic puz­zle piece fit­ting story by us­ing the fa­mil­iar shapes.

Both together cannot be done - but one could probably do a much nicer job of an exact list of the ingredients and then a nearly exact fit of the pieces - or maybe "pouring the pixels" highly accurately in proportion into the outline shape of Africa as the vessel to contain them all - but afterwards each country is just a layer of color - I preferred the outlines, even if rather rough and symbolic.

Hopefully some­one will im­prove on it - as mine was by far not the first and should also not be the last at­tempt to tell the story :)

But many totally missed the single big point: NO - this was not at all an attempt to create "an accurate map", it was merely a simple graphical depiction of the statement: Africa is just immense - much, much larger than you or I thought. Just look at it, realize that, and smile - because you will never forget it again :)

And: here is to Africa achiev­ing the stature that it de­serves to have…

...

Read the original on kai.sub.blue »

5. 251 shares, 11 trendiness

How this woman scammed the world, then vanished

He takes the first one on the list and looks it up on the Companies House website. Everything is meant to be transparent - the website contains the details of every company in the UK. It's thought to be a key anti-corruption tool. "We are very proud of this in this country," he says. "The problem is that when you create this company, no-one checks any of the information provided." He clicks to see the company's filing history, but where you should see company accounts, there is nothing. "This is classic," he exclaims. "Look, nothing has happened. They have filed no financial information at all." Then he tries checking the company's owners. The UK began to insist recently that companies must enter the name of the person with "significant control" - the real owner.

...

Read the original on www.bbc.com »

6. 243 shares, 19 trendiness

I Have a Lot to Say About Signal’s Cellebrite Hack

This blog post is based off of a talk I gave on May 12, 2021 at the Stanford Computer Science Department’s weekly lunch talk se­ries on com­puter se­cu­rity top­ics. Full dis­clo­sure: I’ve done some con­sult­ing work for Signal, al­beit not on any­thing like this is­sue. (I kinda doubt they’ll hire me again if they read this, though.)

You may have seen a story in the news re­cently about vul­ner­a­bil­i­ties dis­cov­ered in the dig­i­tal foren­sics tool made by Israeli firm Cellebrite. Cellebrite’s soft­ware ex­tracts data from mo­bile de­vices and gen­er­ates a re­port about the ex­trac­tion. It’s pop­u­lar with law en­force­ment agen­cies as a tool for gath­er­ing dig­i­tal ev­i­dence from smart­phones in their cus­tody.

In April, the team be­hind the pop­u­lar end-to-end en­crypted (E2EE) chat app Signal pub­lished a blog post de­tail­ing how they had ob­tained a Cellebrite de­vice, an­a­lyzed the soft­ware, and found vul­ner­a­bil­i­ties that would al­low for ar­bi­trary code ex­e­cu­tion by a de­vice that’s be­ing scanned with a Cellebrite tool.

As cov­er­age of the blog post pointed out, the vul­ner­a­bil­ity draws into ques­tion whether Cellebrite’s tools are re­li­able in crim­i­nal pros­e­cu­tions af­ter all. While Cellebrite has since taken steps to mit­i­gate the vul­ner­a­bil­ity, there’s al­ready been a mo­tion for a new trial filed in at least one crim­i­nal case on the ba­sis of Signal’s blog post.

Is that mo­tion likely to suc­ceed? What will be the likely ram­i­fi­ca­tions of Signal’s dis­cov­ery in court cases? I think the im­pact on ex­ist­ing cases will be neg­li­gi­ble, but that Signal has made an im­por­tant point that may help push the mo­bile de­vice foren­sics in­dus­try to­wards greater ac­count­abil­ity for their of­ten sloppy prod­uct se­cu­rity. Nevertheless, I have a raised eye­brow for Signal here too.

Cellebrite is an Israeli company that, per Signal's blog post, makes software to "automate physically extracting and indexing data from mobile devices." A common use case here in the U.S. is to be used by law enforcement in criminal investigations, typically with a warrant under the Fourth Amendment that allows them to search someone's phone and seize data from it.

Cellebrite's products are part of the industry of "mobile device forensics" tools. The mobile forensics process aims to "recover digital evidence or relevant data from a mobile device in a way that will preserve the evidence in a forensically sound condition," using accepted methods, so that it can later be presented in court.

Who are their cus­tomers?

Between Cellebrite and the other ven­dors in the in­dus­try of mo­bile de­vice foren­sics tools, there are over two thou­sand law en­force­ment agen­cies across the coun­try that have such tools — in­clud­ing 49 of the 50 biggest cities in the U. S. Plus, ICE has con­tracts with Cellebrite worth tens of mil­lions of dol­lars.

But Cellebrite has lots of customers besides U.S. law enforcement agencies. And some of them aren't so nice. As Signal's blog post notes, "Their customer list has included authoritarian regimes in Belarus, Russia, Venezuela, and China; death squads in Bangladesh; military juntas in Myanmar; and those seeking to abuse and oppress in Turkey, UAE, and elsewhere."

The vendors of these kinds of tools love to get up on their high horse and talk about how they're "the good guys," they help keep the world safe from criminals and terrorists. Yes, sure, fine. But a lot of vendors in this industry, the industry of selling surveillance technologies to governments, sell not only to the U.S. and other countries that respect the rule of law, but also to repressive governments that persecute their own people, where the definition of "criminal" might just mean being gay or criticizing the government. The willingness of companies like Cellebrite to sell to unsavory governments is why there have been calls from human rights leaders and groups for a global moratorium on selling these sorts of surveillance tools to governments.

What do Cellebrite’s prod­ucts do?

Cellebrite has a few dif­fer­ent prod­ucts, but as rel­e­vant here, there’s a two-part sys­tem in play: the first part, called UFED (which stands for Universal Forensic Extraction Device), ex­tracts the data from a mo­bile de­vice and backs it up to a Windows PC, and the sec­ond part, called Physical Analyzer, parses and in­dexes the data so it’s search­able. So, take the raw data out, then turn it into some­thing use­ful for the user, all in a foren­si­cally sound man­ner.

As Signal's blog post explains, this two-part system requires physical access to the phone; these aren't tools for remotely accessing someone's phone. And the kind of extraction (a "logical extraction") at issue here requires the device to be unlocked and open. (A logical extraction is quicker and easier, but also more limited, than the deeper but more challenging type of extraction, a "physical extraction," which can work on locked devices, though not with 100% reliability. Plus, logical extractions won't recover deleted or hidden files, unlike physical extractions.) As the blog post says, "think of it this way: if someone is physically holding your unlocked device in their hands, they could open whatever apps they would like and take screenshots of everything in them to save and go over later. Cellebrite essentially automates that process for someone holding your device in their hands."

Plus, unlike some cop taking screenshots, a logical data extraction "preserves the recovered data in its original state with forensically-sound integrity admissible in a court of law." Why show that the data were extracted and preserved without altering anything? Because that's what is necessary to satisfy the rules for admitting evidence in court. U.S. courts have rules in place to ensure that the evidence that is presented is reliable — you don't want to convict or acquit somebody on the basis of, say, a file whose contents or metadata got corrupted. Cellebrite holds itself out as meeting the standards that U.S. courts require for digital forensics.

But what Signal showed is that Cellebrite tools ac­tu­ally have re­ally shoddy se­cu­rity that could, un­less the prob­lem is fixed, al­low al­ter­ation of data in the re­ports the soft­ware gen­er­ates when it an­a­lyzes phones. Demonstrating flaws in the Cellebrite sys­tem calls into ques­tion the in­tegrity and re­li­a­bil­ity of the data ex­tracted and of the re­ports gen­er­ated about the ex­trac­tion.

That un­der­mines the en­tire rea­son for these tools’ ex­is­tence: com­pil­ing dig­i­tal ev­i­dence that is sound enough to be ad­mit­ted and re­lied upon in court cases.

What was the hack?

As back­ground: Late last year, Cellebrite an­nounced that one of their tools (the Physical Analyzer tool) could be used to ex­tract Signal data from un­locked Android phones. Signal was­n’t pleased.

Apparently in re­tal­i­a­tion, Signal struck back. As last mon­th’s blog post de­tails, Signal cre­ator Moxie Marlinspike and his team ob­tained a Cellebrite kit (they’re coy about how they got it), an­a­lyzed the soft­ware, and found vul­ner­a­bil­i­ties that would al­low for ar­bi­trary code ex­e­cu­tion by a de­vice that’s be­ing scanned with a Cellebrite tool. According to the blog post:

Looking at both UFED and Physical Analyzer, … we were sur­prised to find that very lit­tle care seems to have been given to Cellebrite’s own soft­ware se­cu­rity. Industry-standard ex­ploit mit­i­ga­tion de­fenses are miss­ing, and many op­por­tu­ni­ties for ex­ploita­tion are pre­sent. …

[W]e found that it’s pos­si­ble to ex­e­cute ar­bi­trary code on a Cellebrite ma­chine sim­ply by in­clud­ing a spe­cially for­mat­ted but oth­er­wise in­nocu­ous file in any app on a de­vice that is sub­se­quently plugged into Cellebrite and scanned. There are vir­tu­ally no lim­its on the code that can be ex­e­cuted.

For ex­am­ple, by in­clud­ing a spe­cially for­mat­ted but oth­er­wise in­nocu­ous file in an app on a de­vice that is then scanned by Cellebrite, it’s pos­si­ble to ex­e­cute code that mod­i­fies not just the Cellebrite re­port be­ing cre­ated in that scan, but also from all pre­vi­ously scanned de­vices and all fu­ture scanned de­vices in any ar­bi­trary way (inserting or re­mov­ing text, email, pho­tos, con­tacts, files, or any other data), with no de­tectable time­stamp changes or check­sum fail­ures. This could even be done at ran­dom, and would se­ri­ously call the data in­tegrity of Cellebrite’s re­ports into ques­tion.

Signal also cre­ated a video demo to show their proof of con­cept (PoC), which you can watch in the blog post or their tweet about it. They sum­ma­rized what’s de­picted in the video:

[This] is a sam­ple video of an ex­ploit for UFED (similar ex­ploits ex­ist for Physical Analyzer). In the video, UFED hits a file that ex­e­cutes ar­bi­trary code on the Cellebrite ma­chine. This ex­ploit pay­load uses the MessageBox Windows API to dis­play a di­a­log with a mes­sage in it. This is for demon­stra­tion pur­poses; it’s pos­si­ble to ex­e­cute any code, and a real ex­ploit pay­load would likely seek to un­de­tectably al­ter pre­vi­ous re­ports, com­pro­mise the in­tegrity of fu­ture re­ports (perhaps at ran­dom!), or ex­fil­trate data from the Cellebrite ma­chine.

What did Signal say they’re go­ing to do about this?

The blog post announced that going forward, in the future, the Signal app will add "aesthetically pleasing" files, periodically and at random, to Signal's app data caches on Signal users' phones. Here's the last paragraph of the blog post:

In com­pletely un­re­lated news, up­com­ing ver­sions of Signal will be pe­ri­od­i­cally fetch­ing files to place in app stor­age. These files are never used for any­thing in­side Signal and never in­ter­act with Signal soft­ware or data, but they look nice, and aes­thet­ics are im­por­tant in soft­ware. Files will only be re­turned for ac­counts that have been ac­tive in­stalls for some time al­ready, and only prob­a­bilis­ti­cally in low per­cent­ages based on phone num­ber shard­ing. We have a few dif­fer­ent ver­sions of files that we think are aes­thet­i­cally pleas­ing, and will it­er­ate through those slowly over time. There is no other sig­nif­i­cance to these files.

What ex­actly does that mean? Only Moxie and his team know. The rest of us are left to guess. I lit­er­ally had a re­porter tell me that they could­n’t tell if this part of the blog post was a joke or not.

One interpretation is that "aesthetically pleasing" means they're image files — like, pictures of cats or something — that the Signal user never actually sees and did not actively put in app storage themselves. Another interpretation: if we assume those "aesthetically pleasing" files do what a "real exploit payload" could do, then (absent a mitigation by Cellebrite) these files could affect a Cellebrite machine if that phone got analyzed with a Cellebrite tool while those files were in app storage.

If noth­ing else, it means that if they fol­low through on what they say they’ll do, then Signal will add noise to the, uh, sig­nal in the Signal ap­p’s lo­cal stor­age on some users’ phones. But only some users, and Signal won’t know which users, and the files will change pe­ri­od­i­cally, if they’re there at all. It won’t be the case that all users of Signal will have the same files added by Signal into lo­cal stor­age at all times go­ing for­ward.

What did Signal sug­gest Cellebrite should do about this po­ten­tial ex­ploit?

Here’s what Signal sug­gested Cellebrite should do:

Any app could con­tain such a file [i.e. a booby-trapped file], and un­til Cellebrite is able to ac­cu­rately re­pair all vul­ner­a­bil­i­ties in its soft­ware with ex­tremely high con­fi­dence, the only rem­edy a Cellebrite user has is to not scan de­vices. Cellebrite could re­duce the risk to their users by up­dat­ing their soft­ware to stop scan­ning apps it con­sid­ers high risk for these types of data in­tegrity prob­lems, but even that is no guar­an­tee.

Basically, what they're saying is: "We're going to screw with you for adding support to Cellebrite for Signal data. If you want to be sure of your own data integrity, your users (the cops) should stop scanning phones that have Signal installed. But even then, you can't really be sure, because the apps that you or law enforcement deem high-risk might not be the ones poisoning your machines. The only way to be sure is for your users (the cops) to stop doing the one thing that your tools are made to do, which ultimately could put you out of business."

Signal went on, "We are of course willing to responsibly disclose the specific vulnerabilities we know about to Cellebrite if they do the same for all the vulnerabilities they use in their physical extraction and other services to their respective vendors, now and in the future."

Basically, "I'll show you mine if you show me yours." That is not generally how vulnerability disclosure works, and AFAIK, Cellebrite has not taken them up on the offer so far.

By the way, this isn't the first time Cellebrite's been outed for having shoddy security. In 2017, a hacker hacked Cellebrite's servers and obtained "900 GB of data related to Cellebrite," including (1) Cellebrite customers' usernames and passwords for logging into its websites; (2) "a vast amount of technical data regarding Cellebrite's products"; and even (3) "what appear[ed] to be evidence files from seized mobile phones, and logs from Cellebrite devices."

What was Cellebrite’s ac­tual re­sponse to the hack?

According to Vice, a few days after the blog post, Cellebrite "pushed an update to its customers … limit[ing] what products can perform a logical iOS extraction." The company didn't admit whether the vuln was the one Signal described. (But basically everybody assumes that's the case.) Cellebrite did say, "Based on our reviews, we have not found any instance of this vulnerability being exploited in the real-life usage of our solutions." A Cellebrite customer who commented to Vice said, "'It appears to be an attempt to minimize the attack surface[,] not a fix[.]'"

From the news re­ports, it sounds like Cellebrite has tem­porar­ily turned off iPhone sup­port for the Physical Analyzer tool. (Note Cellebrite only turned off sup­port for Physical Analyzer, even though the Signal blog post’s demo was about the UFED soft­ware and they said sim­i­lar ex­ploits ex­ist for Physical Analyzer.) You’ll re­call that Physical Analyzer is the sec­ond part of the two-part sys­tem. UFED cre­ates the backup, Physical Analyzer parses the files.

But even though UFED has vulns too, Cellebrite cus­tomers can still use UFED to dump the data from iPhones onto a lo­cal backup. You can back up the data but you can’t do any­thing with it for now. That’s still kinda weird, be­cause if vulns in UFED could also al­ter data, why keep sup­port for UFED on? Isn’t there a risk that those data dumps could be al­tered? My guess: Cellebrite’s go­ing half­sies be­cause it would be even more dis­as­trous for their busi­ness to yank sup­port for both prod­ucts, and they’re con­fi­dent enough that there aren’t any real-world ex­ploits for UFED that they left it work­ing for iPhones, since they fig­ure cus­tomers will want to keep pre­serv­ing ev­i­dence with those data dumps (which is surely eas­ier than keep­ing the phone pow­ered on, charged, and in an un­locked state in­def­i­nitely), but they deemed the Physical Analyzer vulns more dan­ger­ous, so that’s the part they de­cided to pause for now. But that’s just my guess. In any event, this is just a Band-Aid so­lu­tion: Cellebrite will have to re­store iOS sup­port for Physical Analyzer sooner or later.

It’s like there’s a bull that’s in the yard out­side a china shop, and it’s been locked in the yard in­side the fence. So it’s be­ing con­tained there. That’s not a long-term so­lu­tion, and the bull might still do dam­age to the yard, but the own­ers of the china shop think the bull will prob­a­bly be chill, and any dam­age won’t be as bad as it would be if the bull were to get in­side the china shop. And to keep the bull from go­ing in­side the china shop, for now, they boarded over the door to the china shop. But in­side, the shop is still full of frag­ile, break­able china. It won’t be safe to turn the Physical Analyzer back on un­til they’ve con­verted the china to adaman­tium or some­thing. (Yeah, sorry, it’s not the best metaphor.)

So what does Signal’s stunt mean for law en­force­ment use of Cellebrite?

The jour­nal­ist Thomas Fox-Brewster sum­ma­rized the the­o­ret­i­cal fall­out suc­cinctly in Forbes:

"This could be a severe issue for the many police agencies using Cellebrite across the world. If a criminal can hack a Cellebrite device by running a malicious file like the one described by Marlinspike, they could spoil evidence."

No, intentionally spoiling evidence — or "spoliating," to use the legal term — is definitely not legal.

Neither is hacking somebody's computer, which is what Signal's blog post is saying a "real exploit payload" could do. It said, "a real exploit payload would likely seek to undetectably alter previous reports, compromise the integrity of future reports (perhaps at random!), or exfiltrate data from the Cellebrite machine." All of those things are a violation of the federal anti-hacking law known as the Computer Fraud and Abuse Act, or CFAA, and probably also of many state-law versions of the CFAA. (If the computer belongs to a federal law enforcement agency, it's definitely a CFAA violation. If it's a state, local, or tribal government law enforcement agency, then, because of how the CFAA defines "protected computers" covered by the Act, it might depend on whether the Windows machine that's used for Cellebrite extractions is connected to the internet or not. That machine should be segmented apart from the rest of the police department's network, but if it has an internet connection, the CFAA applies. And even if it doesn't, I bet there are other ways of easily satisfying the "protected computer" definition.)

So is, uh, is Signal going to update its app to make it hack police computers? Recall what Signal said about how "upcoming versions of Signal will be periodically fetching files to place in app storage…". It's very cutesy, coy, evasive language and it doesn't say exactly what the hell they mean by that. They're winking and smiling and nudging the reader instead of being clear.

They seem to be implying — or at least they seem to intend for the reader, and more importantly Cellebrite and its customers, to infer — that Signal will add "innocuous" code to their app that might, maybe, alter the data on a Cellebrite machine if the phone gets plugged into it. If they're saying what they're hinting they're saying, Signal basically announced that they plan to update their app to hack law enforcement computers and also tamper with and spoliate evidence in criminal cases.

When you put it that way, it be­comes clear why they were us­ing such coy lan­guage and why I bet they’re bluff­ing: Those things are il­le­gal. It’s a stunt that could get their own users in trou­ble (if the user gets blamed for what her phone does to a Cellebrite ma­chine, she will be plunged into a world of pain, ir­re­spec­tive of whether she would ul­ti­mately be held cul­pa­ble for the de­sign of an app she had in­stalled on her phone), and could get them in hot wa­ter (because they in­ten­tion­ally de­signed and put those booby-trapped files on the user’s phone).

Plus, ad­mit­tedly I haven’t ac­tu­ally looked into this at all, but it seems like it could get Signal kicked out of the Apple and Google app stores, if the com­pa­nies in­ter­pret this as a vi­o­la­tion of their app store rules against mal­ware. (It would­n’t ac­tu­ally help pro­tect pri­vacy or free ex­pres­sion or hu­man rights, as Signal prides it­self on do­ing, if peo­ple can’t in­stall and up­date the app, or if they side­load ma­li­cious fake ver­sions of Signal that some cy­ber­crime gang or evil gov­ern­ment puts out there.)

So my guess is that they’re play­ing this nudge-wink, plau­si­ble de­ni­a­bil­ity, vague lan­guage game, where maybe you might in­fer that they’re go­ing to make their app hack Cellebrite ma­chines and spoil ev­i­dence, but in ac­tu­al­ity they never had any in­ten­tion of ac­tu­ally do­ing that. It was just to mess with Cellebrite and make a point. At most, maybe they stick some files in app stor­age that don’t do any­thing ma­li­cious at all. And maybe Cellebrite’s prompt re­sponse con­ve­niently gave Signal an out from hav­ing to fol­low through, on top of the plau­si­ble de­ni­a­bil­ity of their cutesy eva­sive lan­guage.

Still, it’s a weird choice to make, for the pub­lic-fac­ing of­fi­cial com­mu­ni­ca­tions of an or­ga­ni­za­tion that makes an app with mil­lions of users around the world, to kinda-sorta vaguely an­nounce that you maybe just might re­design your app to break the law and screw with law en­force­ment.

Will this mean a bunch of de­fen­dants’ crim­i­nal cases get thrown out?

This is just a PoC. Yes, re­search showed there’s this flaw, but Signal’s demo is just a demo. It does­n’t mean the vuln they found was ever ac­tu­ally ex­ploited in the wild. Cellebrite told their cus­tomers they don’t be­lieve it was, though they did­n’t say how they reached that con­clu­sion. (And, well, there are ob­vi­ous rea­sons to be skep­ti­cal of the qual­ity of their in­ci­dent re­sponse.)

But crim­i­nal de­fense at­tor­neys are still go­ing to try to make use of this — as they should; they should hold the pros­e­cu­tion ac­count­able for the re­li­a­bil­ity of the ev­i­dence used against their clients. There’s a case in state court in West Virginia where the case al­ready went to trial, Cellebrite ev­i­dence was in­tro­duced, the de­fen­dant was con­victed, and based on this blog post, the at­tor­ney moved for a new trial and to ex­am­ine the Cellebrite ma­chine. I sus­pect there’ll be other at­tor­neys fil­ing sim­i­lar mo­tions.

My guess is that these de­fense lawyers are un­likely to get their clients a new trial in many, if any, of the cases where a ver­dict has al­ready been re­turned; but that in any on­go­ing open cases, those lawyers have bet­ter odds of get­ting the court to grant them the chance to ex­am­ine the Cellebrite ma­chine if they did­n’t do so be­fore (or maybe to ex­am­ine it again if they did).

The thing to understand is that the mere speculative possibility that data in a Cellebrite report might have been altered isn't going to sway any judges. If you're a defense attorney, just showing the court this blog post saying "oh, Cellebrite software has a lot of vulns, and there was this vuln in particular, here's a sample exploit for it, and oh by the way maybe Signal will do something to exploit it in future versions of the Signal app": that's not going to be enough.

Another rea­son that le­gal chal­lenges prob­a­bly won’t go very far is that it should be pretty straight­for­ward for law en­force­ment to dis­prove an ac­cu­sa­tion about the Cellebrite ma­chine. In a re­cent le­gal we­bi­nar about mo­bile de­vice foren­sics tools, the dis­cus­sion touched upon Signal’s Cellebrite hack. One of the pan­elists pointed out that Cellebrite’s not the only game in town when it comes to these ex­trac­tion tools. It’s a whole in­dus­try, it’s not just this one com­pany, al­though Cellebrite is prob­a­bly the best-known ac­tor in that in­dus­try. Therefore, as the pan­elist pointed out, if you’re law en­force­ment, you can just per­form the same ex­trac­tion through a dif­fer­ent pro­gram, and there won’t be a prob­lem be­cause this flaw is unique to Cellebrite. Sure, prob­a­bly those other com­pa­nies’ tools have bugs too (and they should get their act to­gether too), but there’s been no show­ing that every other tool out there has an iden­ti­cal flaw that could be ex­ploited in an iden­ti­cal way. So Signal’s hack does­n’t draw into doubt all mo­bile de­vice foren­sics tools.

Thus, if there’s a chal­lenge to the in­tegrity of Cellebrite data in a par­tic­u­lar crim­i­nal case, the pros­e­cu­tion should be able to read­ily prove there’s no cor­rup­tion by just run­ning the ex­trac­tion through more than one foren­sic pro­gram be­sides Cellebrite. Then they could com­pare the out­puts of the two tools and see if there are dif­fer­ences. Or, they could just not use Cellebrite at all and just use the other tool. (Of course, in­duc­ing the cops to stop us­ing Cellebrite would be some sweet re­venge for Signal.)

In some cases, the prosecution could also potentially use witness testimony by a law enforcement officer to corroborate what's in the Cellebrite report and show that the report is accurate. Remember, this Cellebrite two-part system works on already unlocked phones. And criminal suspects will often consent to unlock their phones when the police ask them to. (You don't have to, you can say no, but people often say yes.) If that happened in a case where the Cellebrite data was challenged, the state could call the cop to testify and the cop might be able to say, "The defendant unlocked his phone for me and I flipped through it and saw these incriminating texts in Signal with the timestamp on them from such-and-such a date, and that was before we took the phone back to the police station and plugged the phone into the Cellebrite machine. And looking at this Cellebrite report, yep, what's in the report matches up with my memory: same text messages, same timestamp as I remember seeing when I was doing my manual inspection of the contents of his phone."

The point be­ing, as I wrote on Twitter at the time, these chal­lenges will go nowhere un­less the de­fense can come up with some plau­si­ble ac­tual ev­i­dence of cor­rup­tion of the Cellebrite ma­chine or the data ex­tracted. There has to be some­thing to show that the Cellebrite data ex­trac­tion us­ing that Cellebrite tool on this phone is not re­li­able. No judge will throw out ev­i­dence from a Cellebrite analy­sis just be­cause Signal did a PoC.

So I hope lawyers are ad­vis­ing their clients that mov­ing for a new trial, or mov­ing to (re)examine the Cellebrite de­vice as part of an on­go­ing pros­e­cu­tion, solely on the ba­sis of Signal’s blog post, is a Hail Mary move that will prob­a­bly go nowhere. Lawyers ab­solutely should do this, in or­der to zeal­ously rep­re­sent their clients (and put the pros­e­cu­tion through its paces), but it prob­a­bly won’t change the re­sult.

What if it turns out the Cellebrite data was cor­rupted?

Even in the un­likely event that it turns out this ex­ploit was used in the wild, or if a Cellebrite ma­chine oth­er­wise turns out af­ter test­ing to be un­re­li­able, that’s not a guar­an­tee that every crim­i­nal case out there that in­volved a Cellebrite (which is prob­a­bly a lot of cases) de­serves to get the con­vic­tion thrown out. It may hap­pen in some cases, but not in all.

That's because of a doctrine in the law that, while it's complicated in practice, essentially boils down to saying, "OK, so the unreliable data shouldn't have been allowed into evidence. Let's pretend that it hadn't been admitted: would that have had a big effect on the jury's verdict, or would it probably have been the same as it was when the unreliable evidence was let in?" If the jury probably would've convicted anyway even without the Cellebrite evidence, then the guilty conviction will stand.

My guess is that it's pretty rare that the Cellebrite evidence is the dispositive crux of a case, meaning, but for evidence pulled from a phone using a Cellebrite device, the jury would not have voted to convict. Surely it happens sometimes, but not in every case. Plus, the courts have an unfortunate tendency to say "yeah, the jury would've convicted anyway." This doctrine doesn't actually come out super favorably to the defense when applied in real cases. Like many things about the American justice system, it's not fair but it's the reality.

Plus, there’s a rea­son why ju­ries would likely con­vict any­way in many cases. In a typ­i­cal case, the Cellebrite data will be just one part of the ev­i­dence pre­sented by the pros­e­cu­tion. In many cases the pros­e­cu­tion will have a lot of ev­i­dence that they can ob­tain from on­line ser­vice providers rather than just re­ly­ing on the lo­cal client-side copy that was ex­tracted us­ing Cellebrite. That might in­clude, for ex­am­ple, iCloud back­ups; phone records from the phone com­pany show­ing whom you called or texted and who called or texted you; cell-site lo­ca­tion in­for­ma­tion, GPS data, or other lo­ca­tion in­for­ma­tion show­ing where you were when; emails from Gmail; posts and DMs from so­cial me­dia; and so on. Much of the data about us is in the cloud, or oth­er­wise held by third par­ties and read­ily avail­able to the gov­ern­ment with the right le­gal process.

Signal is an outlier in that it keeps almost zero data about its users. All they can provide in response to a subpoena or other legal process is "Unix timestamps for when each account was created and the date that each account last connected to the Signal service. That's it." So you can see why law enforcement would really want Cellebrite to work for Signal data in particular: a user's phone is the only place to get Signal messages. (Of course, by definition there are multiple participants in any conversation. QED.) Still, Signal data is often going to be one piece of evidence among many.

What’s more, dig­i­tal ev­i­dence is­n’t the only ev­i­dence. A typ­i­cal case may also in­volve tes­ti­mony by mul­ti­ple wit­nesses (perhaps in­clud­ing a po­lice of­fi­cer who saw in­crim­i­nat­ing files on the phone while go­ing through it man­u­ally, as noted above), a con­fes­sion by the de­fen­dant, tes­ti­mony by a co-con­spir­a­tor who turns state’s ev­i­dence and points the fin­ger at the de­fen­dant, hard-copy doc­u­ments and other items seized with a search war­rant, and so on.

Why does this mat­ter, then, if Cellebrite mit­i­gated the flaw and no­body’s case gets thrown out?

It mat­ters be­cause peo­ple have a con­sti­tu­tional right to a fair trial, and to con­front the ev­i­dence against them, thanks to the Sixth Amendment. We also have a con­sti­tu­tional right to pro­ce­dural due process un­der the Fifth Amendment, mean­ing that if you are haled into court, there are rules; it’s not just a Kafka-esque show trial in a kan­ga­roo court where any­thing goes. Signal’s hack demon­strated an im­por­tant point: if you’re go­ing to con­vict some­body of a crime, put them be­hind bars, and take away their free­dom, based in part on the re­ports from a com­puter sys­tem, then at least that sys­tem should have ad­e­quate se­cu­rity.

Moxie’s hack might af­fect past cases, though I doubt it’ll change the out­come in many, if any. And, this par­tic­u­lar find­ing won’t af­fect fu­ture cases. That’s be­cause Cellebrite ap­par­ently has al­ready is­sued a mit­i­ga­tion (a stop-gap mea­sure to stanch the bleed­ing). (Which, if you’re law en­force­ment, you’re fum­ing mad at Cellebrite: what the hell did you pay them all this money for? As said, Cellebrite will have to get Physical Analyzer work­ing on iPhones again be­fore too long.)

Going for­ward, Signal’s hack may in­duce more de­fense at­tor­neys to de­mand to ex­am­ine Cellebrite de­vices more of­ten — not just be­cause of this bug, but be­cause if this ex­ploitable bug was in there, surely there are oth­ers, too. The blog post makes it sound like Cellebrite’s se­cu­rity is so bad that there’s plenty of low-hang­ing fruit in there.

Giving defense attorneys more ammo to push back harder against the use of Cellebrite devices against their clients is Good and Right and Just. The general point that Moxie made — that Cellebrite’s tools are buggy AF and can be exploited in ways that undermine the reliability of their reports and extractions as evidence, which is the entire point of their existence — is actually more important than the specifics of this exploit or of Signal’s announced app update (because Cellebrite has already kinda-sorta mitigated this exploit).

The bigger point is that Cellebrite, like other law enforcement tech vendors, plays fast and loose with the technology it sells. Law enforcement and prosecutors then rely upon these tools, despite their half-assed security, to justify taking people’s freedom away. So Signal turned the tables and showed that the emperor has no clothes. Signal did a public service by destabilizing the perception that Cellebrite’s tools are reliable.

Even if Signal’s blog post doesn’t get anybody a new trial, it proves the point that courts shouldn’t rely so readily on Cellebrite or other such law enforcement technology. Megan Graham, who supervises a technology law & public policy clinic at Berkeley Law, discussed the likely ramifications of Signal’s hack in a thread on Twitter and comments to the press. Drawing on her deep experience working on these issues, she noted that law enforcement tech vendors’ approach to security is usually “basically YOLO.” She wasn’t surprised at how bad Cellebrite’s security is.

As Graham said, hopefully the takeaway for judges is to dig deeper into how reliable these law enforcement technologies actually are before admitting evidence obtained with them. This could be a wake-up call for courts — but it’s gonna be an uphill battle.

There are big obstacles between the world as it is and the world as it should be. The courts — especially the state courts, like the one in West Virginia where that one lawyer already asked for a new trial — are very busy and very short on resources. (Graham and I both know that firsthand, as we both clerked for federal magistrate judges.) Judges don’t have a lot of time; their caseloads are really heavy and there’s just too much to do. They probably lack the background to understand these new technologies on their own, and they don’t necessarily have the in-house resources and personnel to do it for them, because court budgets are always strapped. That said, judges can and do attend trainings on current topics and new technologies like this… if somebody’s offering them. (And the ones offering them may well be prosecutors and the vendors of these tools.)

And anyway, judges don’t run the parties’ cases for them. Judges can decide “sua sponte” (on their own) to challenge something that one of the parties says or does, or to tell one of the parties to do something in particular, but for the most part, objecting to something one of the parties says or does is the lawyers’ job, not the judge’s. So it’s usually going to be up to the defense lawyer in these cases to bring a challenge to a particular forensic tool or process, or to challenge an expert witness’s qualifications.

That costs time and money and resources, and those just won’t always be available to every defendant, who, let’s face it, faces an unlevel playing field in the American criminal justice system — by design. In our system, the deck is stacked in favor of the prosecution. And that’s why companies like Cellebrite get away with sloppy work.

So Signal’s stunt should be a wake-up call to the courts: these tools aren’t as reliable as they’re held out to be, they really ought to be a lot more secure, and courts should really dig into them more. But the process for actually holding these law enforcement technology vendors to account and forcing them to do better is very slow. We’re not going to see a massive sea change overnight just from this blog post. The police departments that use Cellebrite, and the mobile device forensics industry in general, are on notice that they need to get their act together. But unless and until judges and criminal defense attorneys force them to — or, perhaps more realistically, unless and until their law enforcement customers force them to by refusing to hand over any more money until they step up their game — companies like Cellebrite will continue to skate by. And that’s the value of this hack. It’s one step towards forcing more accountability.

Plus, the prob­lem is not just mo­bile de­vice foren­sics tools

Another reason this topic is important is that Cellebrite is not the only example of a technology that gets sold by private-sector vendors to some unit of government — law enforcement, a state administrative agency, or the courts, for example — and then gets used to help convict criminal defendants or otherwise affect people’s rights, lives, and livelihoods. There are a ton of vendors out there that sell their tools to one part or another of the country’s state, local, tribal, and federal governments.

But their tools are of­ten a black box. It’s not clear how they work, or whether they ac­tu­ally work the way the ven­dors say they do. The ven­dors aren’t above mak­ing the gov­ern­ment cus­tomers that buy their tools sign a non-dis­clo­sure agree­ment say­ing they won’t dis­close any­thing about the tool they’re pay­ing for — de­spite the fact that the use of these tools can im­pli­cate peo­ple’s con­sti­tu­tional rights, which is not some­thing you can just wipe away with a con­tract.

Other ex­am­ples of tech­nolo­gies that have been paid for with your tax dol­lars in­clude:

(1) TrueAllele, a software program for helping law enforcement analyze DNA samples by using what’s called probabilistic genotyping, which uses complex mathematical formulas to examine “the statistical likelihood that a certain genotype comes from one individual over another” (see the sketch after this list); it costs $60,000 to license, and its vendor has fought tooth and nail to keep its source code from being examined by criminal defense counsel, citing trade secrecy (but, in at least one recent case, losing);

(2) Stingray de­vices, aka IMSI catch­ers — de­vices used by law en­force­ment which mimic cell phone tow­ers and force the phones of every­body in the area to con­nect to them, whose ven­dor made law en­force­ment agen­cies sign NDAs that led the agen­cies to out­right lie to courts in mul­ti­ple cases about the very ex­is­tence and use of Stingray de­vices in crim­i­nal cases; and

(3) algorithms that are used by the state to make decisions about such crucial choices as:

- whether arrestees who are in jail should or shouldn’t be let out on bail

- whether employees should be given or denied a

- whether people should get benefits such as Medicaid, Medicare, unemployment, and Social Security Disability Insurance.
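
For readers unfamiliar with the term: probabilistic genotyping tools generally summarize their analysis as a likelihood ratio, a comparison of how probable the DNA evidence is under the prosecution’s hypothesis versus the defense’s. What follows is only a sketch of that general framework; TrueAllele’s actual model and source code remain secret, which is exactly what the black-box fights described here are about:

LR = P(E | Hp) / P(E | Hd)

where E is the observed DNA profile, Hp is the hypothesis that the defendant contributed to the sample, and Hd is the hypothesis that someone else did. The larger the ratio, the more strongly the evidence is said to favor the prosecution’s hypothesis.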

Understandably, the peo­ple who are on the re­ceiv­ing end of these tech­nolo­gies want to know how the tools work. So their lawyers push for ac­cess to peek in­side the black box and ex­am­ine the hard­ware and/​or soft­ware and even the source code, to see how it works and try to fig­ure out if it’s flawed. And of­ten the re­sponse, as with TrueAllele and Stingrays, is that ei­ther the state, or the pri­vate-sec­tor ven­dor that makes the tool, or both, will try to keep what’s un­der the hood a se­cret. They try to keep the black box closed. These black-box chal­lenges have been a long, hard slog for ad­vo­cates fight­ing for more trans­parency and fair­ness, and the win rate is far less than 100%.

Signal ba­si­cally short-cir­cuited that whole process by just get­ting their hands on this piece of law en­force­ment tech­nol­ogy, tear­ing it apart, and pub­lish­ing some of what they found. That set the stage for ad­di­tional or re­newed chal­lenges by de­fense coun­sel to de­mand to ex­am­ine the tools.

This is why white-hat se­cu­rity re­search (and up­dat­ing the law to pro­tect it) is so im­por­tant. Private-sector ven­dors like Cellebrite that sell their tech­nol­ogy to the pub­lic sec­tor have a good thing go­ing: they get that sweet sweet tax­payer money on those con­tracts, they have lit­tle in­cen­tive to dot their i’s and cross their t’s in terms of prod­uct qual­ity so long as the cus­tomer is sat­is­fied (because the peo­ple who are sub­jected to their tools are not the cus­tomer), and they can some­times get away with keep­ing their tools’ in­ner work­ings a se­cret. But white-hat se­cu­rity re­search does­n’t nec­es­sar­ily color in­side the lines that the ven­dor dic­tates, and it can be an im­por­tant way to get cru­cial in­for­ma­tion about these gov-tech tools that the ven­dor would not share will­ingly.

In sum, Signal’s hack was a stunt that has al­ready been mit­i­gated and prob­a­bly won’t set any­body free from prison. But that does­n’t mean this was all in vain. The sil­ver lin­ing is that hope­fully white-hat se­cu­rity re­search like this will push crim­i­nal de­fense lawyers, courts, and law en­force­ment agen­cies to make ven­dors like Cellebrite do a bet­ter job… ide­ally be­fore the black hats take ad­van­tage of their slop­pi­ness.

Yes, Signal did a cool hack. Overall it was a proso­cial move. But it was also a stunt, and Signal’s blog post was vague and con­fus­ing and seemed to sug­gest, in a cutesy, plau­si­bly de­ni­able way, that the Signal app is go­ing to be up­dated so as to hack po­lice com­put­ers.

So while com­puter se­cu­rity folks were gig­gling at Signal’s cute, clever blog post, lawyers like me were sigh­ing. Why? Because of an im­por­tant life les­son that en­gi­neers typ­i­cally don’t un­der­stand: Judges hate cute and clever.

In gen­eral, if you do some­thing very clever and you show it off in a cute pre­sen­ta­tion, it won’t go over well with a judge. They have no pa­tience for stunts and show­boat­ing. The court­room is not the stage at DEF CON. And judges do not like mealy-mouthed vague state­ments that are de­signed for plau­si­ble de­ni­a­bil­ity.

...

Read the original on cyberlaw.stanford.edu »

7 231 shares, 9 trendiness, words and minutes reading time

lana-k/sqliteviz

Sqliteviz is a sin­gle-page of­fline-first PWA for fully client-side vi­su­al­i­sa­tion of SQLite data­bases or CSV files.

With sqliteviz you can:

- run SQL queries against a SQLite database and create Plotly charts based on the result sets

- manage queries and chart settings and run them against different databases

- use it offline from your OS application menu like any other desktop app

The lat­est re­lease of sqlite­viz is de­ployed on GitHub Pages at lana-k.github.io/​sqlite­viz.

It’s a kind of middle ground between Plotly Falcon and Redash.

It is built on top of re­act-chart-ed­i­tor, sql.js and Vue-Codemirror in Vue.js. CSV pars­ing is per­formed with Papa Parse.
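
For a rough sense of how that fully client-side querying works, here is a minimal sketch using sql.js (my own illustration, not code from the sqliteviz repo; the runQuery function and the measurements table are made up for the example). sql.js is SQLite compiled to WebAssembly, so the query runs entirely in the browser:

import initSqlJs from 'sql.js';

async function runQuery(fileBytes) {
  // Load the WebAssembly build of SQLite, then open the uploaded file in memory.
  // (Depending on the bundler, a locateFile option may be needed to find the .wasm file.)
  const SQL = await initSqlJs();
  const db = new SQL.Database(new Uint8Array(fileBytes));

  // exec() returns an array of { columns, values } objects, a shape that is
  // easy to hand off to a charting library such as Plotly.
  const [result] = db.exec('SELECT name, value FROM measurements ORDER BY name');

  db.close();
  return result; // e.g. { columns: ['name', 'value'], values: [['a', 1], ['b', 2]] }
}

In the real app the bytes would come from a file picker or drag-and-drop, and the resulting columns and rows would be handed to the chart editor.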

...

Read the original on github.com »

8 224 shares, 7 trendiness, words and minutes reading time

Nim forum

...

Read the original on forum.nim-lang.org »

9 218 shares, 10 trendiness, words and minutes reading time

Bibliogram

Bibliogram is a web­site that takes data from Instagram’s pub­lic pro­file views and puts it into a friend­lier page that loads faster, gives down­load­able im­ages, elim­i­nates ads, gen­er­ates RSS feeds, and does­n’t urge you to sign up. See an ex­am­ple.

Bibliogram does not al­low you to anony­mously post, like, com­ment, fol­low, or view pri­vate pro­files. It does not pre­serve deleted posts.

...

Read the original on bibliogram.art »

10 199 shares, 10 trendiness, words and minutes reading time

Shirts of Peter Norvig

...

Read the original on charlesbroskoski.com »
