10 interesting stories served every morning and every evening.




1 1,412 shares, 140 trendiness

ChatGPT plugins

Language models today, while useful for a variety of tasks, are still limited. The only information they can learn from is their training data. This information can be out-of-date and is one-size-fits-all across applications. Furthermore, the only thing language models can do out-of-the-box is emit text. This text can contain useful instructions, but to actually follow these instructions you need another process.

Though not a perfect analogy, plugins can be “eyes and ears” for language models, giving them access to information that is too recent, too personal, or too specific to be included in the training data. In response to a user’s explicit request, plugins can also enable language models to perform safe, constrained actions on their behalf, increasing the usefulness of the system overall.

We expect that open standards will emerge to unify the ways in which applications expose an AI-facing interface. We are working on an early attempt at what such a standard might look like, and we’re looking for feedback from developers interested in building with us.

Today, we’re beginning to gradually enable existing plugins from our early collaborators for ChatGPT users, starting with ChatGPT Plus subscribers. We’re also beginning to roll out the ability for developers to create their own plugins for ChatGPT.

In the coming months, as we learn from deployment and continue to improve our safety systems, we’ll iterate on this protocol, and we plan to enable developers using OpenAI models to integrate plugins into their own applications beyond ChatGPT.

...

Read the original on openai.com »

2 1,038 shares, 34 trendiness

GitHub Copilot X: The AI-powered developer experience

At GitHub, our mission has always been to innovate ahead of the curve and give developers everything they need to be happier and more productive in a world powered by software. When we began experimenting with large language models several years ago, it quickly became clear that generative AI represents the future of software development. We partnered with OpenAI to create GitHub Copilot, the world’s first at-scale generative AI development tool, made with OpenAI’s Codex model, a descendant of GPT-3.

GitHub Copilot started a new age of software development as an AI pair programmer that keeps developers in the flow by auto-completing comments and code. And less than two years since its launch, GitHub Copilot is already writing 46% of code and helps developers code up to 55% faster.

But AI-powered auto-completion is just the starting point. Our R&D team at GitHub Next has been working to move past the editor and evolve GitHub Copilot into a readily accessible AI assistant throughout the entire development lifecycle. This is GitHub Copilot X—our vision for the future of AI-powered software development. We are not only adopting OpenAI’s new GPT-4 model, but are introducing chat and voice for Copilot, and bringing Copilot to pull requests, the command line, and docs to answer questions on your projects.

With AI available at every step, we can fundamentally redefine developer productivity. We are reducing boilerplate and manual tasks and making complex work easier across the developer lifecycle. By doing so, we’re enabling every developer to focus all their creativity on the big picture: building the innovation of tomorrow and accelerating human progress, today.


* A ChatGPT-like experience in your editor with GitHub Copilot Chat: We are bringing a chat interface to the editor that’s focused on developer scenarios and natively integrates with VS Code and Visual Studio. This is not just a chat window: it does far more than suggest code. GitHub Copilot Chat recognizes what code a developer has typed and what error messages are shown, and it’s deeply embedded into the IDE. A developer can get in-depth analysis and explanations of what code blocks are intended to do, generate unit tests, and even get proposed fixes to bugs.

GitHub Copilot Chat builds upon the work that OpenAI and Microsoft have done with ChatGPT and the new Bing. It will also join our voice-to-code AI technology extension we previously demoed, which we’re now calling GitHub Copilot Voice, where developers can verbally give natural language prompts.

Sign up for the technical preview >

* Copilot for Pull Requests: You can now sign up for a technical preview of the first AI-generated descriptions for pull requests on GitHub. This new functionality is powered by OpenAI’s new GPT-4 model and adds support for AI-powered tags in pull request descriptions through a GitHub app that organization admins and individual repository owners can install. These tags are automatically filled out by GitHub Copilot based on the changed code. Developers can then review or modify the suggested description.

Enroll your repository in the technical preview >

This is just the first step we’re taking to rethink how pull requests work on GitHub. We’re testing new capabilities internally where GitHub Copilot will automatically suggest sentences and paragraphs as developers create pull requests by dynamically pulling in information about code changes.

We are also preparing a new feature where GitHub Copilot will automatically warn developers if they’re missing sufficient testing for a pull request and then suggest potential tests that can be edited, accepted, or rejected based on a project’s needs.

This complements our efforts with GitHub Copilot Chat, where developers can ask GitHub Copilot to generate tests right from their editor—so, if a developer doesn’t have sufficient test coverage, GitHub Copilot will alert them once they submit a pull request. It will also help project owners set policies around testing, while supporting developers in meeting those policies.

* Get AI-generated answers about documentation: We are launching GitHub Copilot for Docs, an experimental tool that uses a chat interface to provide users with AI-generated responses to questions about documentation—including questions developers have about the languages, frameworks, and technologies they’re using. We’re starting with documentation for React, Azure Docs, and MDN, so we can learn and iterate quickly with the developers and users of these projects.

We’re also working to bring this functionality to any organization’s repositories and internal documentation—so any developer can ask questions via a ChatGPT-like interface about documentation, idiomatic code, or in-house software in their organization and get instant answers.

We know that the benefits of a conversational interface are immense, and we are working to enable semantic understanding of the entirety of GitHub across public and private knowledge bases to better personalize GitHub Copilot’s answers for organizations, teams, companies, and individual developers alike, based on their codebase and documentation.

Moving forward, we are exploring the best ways to index resources beyond documentation, such as issues, pull requests, discussions, and wikis, to give developers everything they need to answer technical questions.

Our work to rethink pull requests and documentation is powered by OpenAI’s newly released GPT-4 AI model.

Even though this model was just released, we’re already seeing significant gains in logical reasoning and code generation. With GPT-4, the state of AI is beginning to catch up with our ambition to create an AI pair programmer that assists with every development task at every point in the developer experience.

Moreover, it’s helping GitHub Copilot understand more of a developer’s codebase to offer more tailored suggestions in PRs and better summations of documentation.

* Copilot for the command line interface (CLI): Next to the editor and pull request, the terminal is the place where developers spend the most time. But even the most proficient developers need to scroll through many pages to remember the precise syntax of many commands. This is why we are launching GitHub Copilot CLI. It can compose commands and loops, and throw around obscure find flags to satisfy your query.

From reading docs to writing code to submitting pull requests and beyond, we’re working to personalize GitHub Copilot for every team, project, and repository it’s used in, creating a radically improved software development lifecycle. Together with Microsoft’s knowledge model, we will harness the reservoir of data and insights that exist in every organization, to strengthen the connection between all workers and developers, so every idea can go from code to reality without friction. At the same time, we will continue to innovate and update the heart of GitHub Copilot—the AI pair programmer that started it all.

GitHub Copilot X is on the horizon, and with it a new generation of more productive, fulfilled, and happy developers who will ship better software for everyone. So—let’s build from here.

...

Read the original on github.blog »

3 996 shares, 41 trendiness

So you've installed `fzf`. Now what?

Software engineers are, if not unique, then darn near unique in the ease with which we can create tools to improve our own professional lives; this, however, can come at a steep cost over time for people who constantly flit back and forth between different tools without investing the time to learn their own kit in depth. As someone with a healthy respect for the tacit knowledge of people better than me, I think a great 80/20 heuristic is “Learn the oldies first”: venerable Unix tools like cat, ls, cd, grep, and cut. (sed and awk, too, if you have the good fortune of landing yourself in an actual modern sysadmin role.)

But there are tools whose return on investment is so immediate, and whose value prop is so unique, that the 80/20 heuristic breaks down entirely for them. fzf is one of them. And it makes me sad to see so many people download it, run it as-is at the command line, and then just shake their heads and walk away, saying “I don’t get it”.

Here I want to change that. Pretend you live on a more-or-less standard Ubuntu box. You’ve just installed fzf using the standard install script — now what?

In most terminals, Linux and Windows alike, Ctrl+R gives you backwards search through your command history. The reason you, like me, may not have heard about this until you had already been hacking away for ten flippin’ years at the shell is because the base version kind of sucks, for two reasons:

1. You need to give an exact match to get what you’re trying to remember.

2. You get only one preview, so if you miss that exact match by even one character, you’re on a wild goose chase.

fzf is a bit of a weird program because installing it actually overwrites a whole bunch of keyboard shortcuts, in the interest of making them better. Normally I would hate this. But…

… This is a considerable improvement on the baseline.

Let’s say you boot into an empty terminal. You’re trying to quickly find your nascent SaaS side-hustle repo and cd to it - but it’s been weeks since you’ve been there, your actual full-time job has been unusually fun and engaging… How do you find it?

Answer: With fzf. fzf rewrites Alt+C into a souped-up fuzzy-cd shortcut that lets you hop around very quickly when all you remember is the vague name of the directory in question.
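If you’re curious what Alt+C is roughly doing for you, here’s a minimal sketch of a hand-rolled fuzzy-cd function. The name fcd is mine, and the real binding is more elaborate (it respects FZF_ALT_C_COMMAND, for instance); this just captures the idea:

```shell
# Rough sketch of fzf's Alt+C: list directories, pick one fuzzily,
# then cd into it. Assumes fzf is on your PATH.
fcd() {
  local dir
  dir=$(find . -type d 2>/dev/null | fzf) && cd "$dir"
}
```

Drop it in your ~/.bashrc; pressing Esc inside fzf cancels without changing directory.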

Okay, we’ve got the shortcuts out of the way. Honestly these two guys alone provide the majority of the value I get out of fzf - but let’s look at what the command, by itself, does.

It fuzzy-finds file locations! Relative ones, at least, to your own directory. This… isn’t that useful, by itself.

And you get a fuzzy-open-in-editor experience!

The other day I was trying to hack together baby’s first live-reload with a Firefox extension, entr, and nginx. And I found myself asking: Where the heck is nginx.conf? I could either:

1. Use my half-remembered knowledge of the FHS to guess around, with trees and greps, or

2. Just know it, commit it to memory, and feel superior to everyone else, or

3. Just pipe ‘find . /’ to fzf and start searching.

I like this clip a lot because it shows some of the subtle tradeoffs of using fzf, as well as one of the more advanced searching features: searching for conf$ will filter out any line that doesn’t end in conf. Notice that fzf temporarily wigs out when find hits a whole lot of “Permission denied” errors - but then recovers itself a few seconds later.

Are those extra few seconds worth the tradeoff for being able to find config files in such a braindead manner? It is for me.
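If the “Permission denied” flood bothers you, you can silence find’s stderr before it ever reaches fzf. A sketch, assuming fzf is installed (the wrapper name ffind is mine, not from the article):

```shell
# Pipe find's output to fzf while discarding "Permission denied"
# noise: errors go to stderr, so 2>/dev/null drops them and fzf
# only ever sees real paths. Assumes fzf is on your PATH.
ffind() {
  find . / 2>/dev/null | fzf
}
```

This also avoids the few seconds of wigging-out described above, since fzf never has to churn through the error lines.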

Thanks to sigmonsays on Hacker News for reminding me of this feature!

About halfway between “overwrite a keyboard shortcut” and “use fzf as-is” is using two stars for fuzzy tab completion. Here’s using it to do something quite similar to vi $(fzf), as above:

You do have to hit Enter one more time after you actually get the command, fair warning.

I’m not yet in the habit of using this all that often, since my only real use case at home is as a replacement for $(fzf), and I just find explicitly calling the boy easier to remember. I imagine it’s a similar experience for tab-tab-star-heads as watching my coworker copy-and-paste manually from the terminal instead of using “:read ! echo Hello world!” is for me.

For when you neither remember exactly what you’re moving, nor where you’re moving it to, but you remember the abstract concept of distance over time well enough to know it simply must be done, and something extremely specific about the nature of each item to be shunted.

Everything I say below can be done with grep as well, but the recursive-by-default nature of rg (also known as ripgrep) is where the tool really comes into its own. I highly recommend you download it and use it for the following examples as well. But if you’re toolshy, don’t worry!

rg . | fzf: Fuzzy search every line in every file

Now we’re getting into some real amnesiac territory >:3c.

rg . | fzf | cut -d ":" -f 1: Fuzzy search every line, in every file, and return the file location

vim $(rg . | fzf | cut -d ":" -f 1): Fuzzy search every line, in every file, and open that file
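The last one-liner is worth wrapping in a function so the quoting survives filenames with spaces. A sketch, assuming rg and fzf are installed (the function name vrg is mine, not from the article):

```shell
# Fuzzy-search every line of every file under the current directory,
# then open the chosen file in vim. ripgrep prints "path:line:text",
# so cut with ":" as the delimiter keeps the path before the first
# colon. Assumes rg, fzf, and vim are on your PATH.
vrg() {
  local file
  file=$(rg . | fzf | cut -d ':' -f 1) && vim "$file"
}
```

Quoting "$file" is the point of the wrapper: the bare vim $(…) version word-splits paths containing spaces.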

...

Read the original on andrew-quinn.me »

4 989 shares, 31 trendiness

Cyclists Now Outnumber Motorists In City Of London

“Cyclists are now the single largest vehicular mode counted during peak times on City streets,” says a report to the transportation committee of the City of London Corporation, the municipal governing body of London’s square mile.

The traffic count figures are in a briefing document provided to councilors for a committee meeting next Tuesday.

At peak times, people cycling represent 40% of road traffic in the City, and 27% throughout the day.

Over the last decade, the use of motor vehicles has been increasingly restricted in the financial heart of the U.K. The 24-hour traffic count was conducted on a wet and windy November day last year.

Walking remains the main way people travel on the City’s streets, says the report to councilors. However, the number of pedestrians is currently below pre-pandemic figures, with the volume of motor vehicles also at 80% of what it was in 2019.

However, cyclist numbers are at 102% of pre-pandemic levels. The number of motorists has fallen by 64% since 1999, while the number of cyclists has increased by 386%.

“Long-term trends observed from count data taken from 12 sites across the City since 1999 show motor vehicle volumes continuing to decline and cycle volumes continuing to increase,” says the traffic order paper to councilors, due to be discussed on 7 March.

The online publication of the materials was spotted by Twitter user @lastnotlost.

Apart from during the pandemic, the most significant percentage drops in motor vehicle use were between 2007-2009 and 2014-16, reveals the briefing document.

Danny Williams, the CEO of arm’s-length government body Active Travel England, said the considerable uptick in cycling levels in the City of London was “quite astonishing.”

...

Read the original on www.forbes.com »

5 811 shares, 29 trendiness

DPReview.com to close

...

Read the original on www.dpreview.com »

6 787 shares, 30 trendiness

A.M. Turing Award

ACM, the Association for Computing Machinery, today named Bob Metcalfe as recipient of the 2022 ACM A.M. Turing Award for the invention, standardization, and commercialization of Ethernet.

Metcalfe is an Emeritus Professor of Electrical and Computer Engineering (ECE) at The University of Texas at Austin and a Research Affiliate in Computational Engineering at the Massachusetts Institute of Technology (MIT) Computer Science & Artificial Intelligence Laboratory (CSAIL).

The ACM A.M. Turing Award, often referred to as the “Nobel Prize in Computing,” carries a $1 million prize, with financial support provided by Google, Inc. The award is named for Alan M. Turing, the British mathematician who articulated the mathematical foundations of computing.

In 1973, while a computer scientist at the Xerox Palo Alto Research Center (PARC), Metcalfe circulated a now-famous memo describing a “broadcast communication network” for connecting some of the first personal computers, PARC’s Altos, within a building. The first Ethernet ran at 2.94 megabits per second, which was about 10,000 times faster than the terminal networks it would replace.

Although Metcalfe’s original design proposed implementing this network over coaxial cable, the memo envisioned “communication over an ether,” making the design adaptable to future innovations in media technology, including legacy telephone twisted pair, optical fiber, radio (Wi-Fi), and even power networks, to replace the coaxial cable as the “ether.” That memo laid the groundwork for what we now know today as Ethernet.

Metcalfe’s Ethernet design incorporated insights from his experience with ALOHAnet, a pioneering computer networking system developed at the University of Hawaii. Metcalfe recruited David Boggs (d. 2022), a co-inventor of Ethernet, to help build a 100-node PARC Ethernet. That first Ethernet was then replicated within Xerox to proliferate a corporate internet.

In their classic 1976 Communications of the ACM article, “Ethernet: Distributed Packet Switching for Local Computer Networks,” Metcalfe and Boggs described the design of Ethernet. Metcalfe then led a team that developed the 10 Mbps Ethernet to form the basis of subsequent standards.

After leaving Xerox in 1979, Metcalfe remained the chief evangelist for Ethernet and continued to guide its development while working to ensure industry adoption of an open standard. He led an effort by Digital Equipment Corporation (DEC), Intel, and Xerox to develop a 10 Mbps Ethernet specification—the DIX standard. The IEEE 802 committee was formed to establish a local area network (LAN) standard. A slight variant of DIX became the first IEEE 802.3 standard, which is still vibrant today.

As the founder of his own Silicon Valley Internet startup, 3Com Corporation, in 1979, Metcalfe bolstered the commercial appeal of Ethernet by selling network software, Ethernet transceivers, and Ethernet cards for minicomputers and workstations. When IBM introduced its personal computer (PC), 3Com introduced one of the first Ethernet interfaces for IBM PCs and their proliferating clones.

Today, Ethernet is the main conduit of wired network communications around the world, handling data rates from 10 Mbps to 400 Gbps, with 800 Gbps and 1.6 Tbps technologies emerging. Ethernet has also become an enormous market, with revenue from Ethernet switches alone exceeding $30 billion in 2021, according to the International Data Corporation.

Metcalfe insists on calling Wi-Fi by its original name, Wireless Ethernet, for old times’ sake.

“Ethernet has been the dominant way of connecting computers to other devices, to each other, and to the Internet,” explains ACM President Yannis Ioannidis. “Metcalfe’s original design ideas have enabled the bandwidth of Ethernet to grow geometrically. It is rare to see a technology scale from its origins to today’s multigigabit-per-second capacity. Even with the advent of Wi-Fi, Ethernet remains the staple mode of data communication, especially when security and reliability are prioritized. It is especially fitting to recognize such an impactful invention during its 50th anniversary year.”

“Ethernet is the foundational technology of the Internet, which supports more than 5 billion users and enables much of modern life,” added Jeff Dean, Google Senior Fellow and SVP of Google Research and AI. “Today, with an estimated 7 billion ports around the globe, Ethernet is so ubiquitous that we take it for granted. It’s easy to forget that our interconnected world would not be the same if not for Bob Metcalfe’s invention and his enduring vision that every computer needed to be networked.”

Metcalfe will be formally presented with the ACM A.M. Turing Award at the annual ACM Awards Banquet, which will be held this year on Saturday, June 10 at the Palace Hotel in San Francisco.

Robert Melancton Metcalfe is Emeritus Professor of Electrical and Computer Engineering (ECE) after 11 years at The University of Texas at Austin. He has recently become a Research Affiliate in Computational Engineering at his alma mater, the Massachusetts Institute of Technology (MIT) Computer Science & Artificial Intelligence Laboratory (CSAIL).

Metcalfe graduated from MIT in 1969 with Bachelor’s degrees in Electrical Engineering and Industrial Management. He earned a Master’s degree in Applied Mathematics in 1970 and a PhD in Computer Science in 1973 from Harvard University.

Metcalfe’s honors include the National Medal of Technology, IEEE Medal of Honor, Marconi Prize, Japan Computer & Communications Prize, ACM Grace Murray Hopper Award, and IEEE Alexander Graham Bell Medal. He is a Fellow of the US National Academy of Engineering, the American Academy of Arts and Sciences, and the National Inventors, Consumer Electronics, and Internet Halls of Fame.

The A.M. Turing Award, ACM’s most prestigious technical award, is given for major contributions of lasting importance to computing.

This site celebrates all the winners since the award’s creation in 1966. It contains biographical information, a description of their accomplishments, straightforward explanations of their fields of specialization, and text or video of their A.M. Turing Award Lecture.

...

Read the original on amturing.acm.org »

7 750 shares, 60 trendiness

The FTC wants to ban those tough-to-cancel gym and cable subscriptions


...

Read the original on www.theverge.com »

8 727 shares, 26 trendiness

Little Snitch Mini

Your Mac can be quite a chatty fellow, talking to strangers all over the world.

You deserve to know who your apps are talking to.

With Little Snitch Mini, you can.

Your apps connect whenever they want, to wherever they want. With Little Snitch Mini, only if you want.

It shows you each and every Internet connection of all apps on your Mac. And if you don’t like what you see, you simply push the Stop button.

Say hello to Blocklists!

It’s never been easier to get rid of unwanted Internet connections.

Choose from a curated collection of blocklists covering thousands of ad servers, tracking servers, and much more. They are kept up-to-date automatically, for optimal protection of your privacy.


Blocklists are organized in categories, so you can quickly find the ones that best suit your needs.

Some of your apps have seen more of the world than you have! They send data to the farthest corners of our planet.

With the map view of Little Snitch Mini you can follow their tracks!

The versatile traffic chart shows you detailed statistics of the data amounts sent and received by your apps over the last 12 months.

What’s going on right now?

The status menu shows an animated live overview of your Mac’s most recent network activity.

The network monitoring functionality, including the real-time connection list, traffic diagrams, and the animated map view, can be used for free!

The full feature set, including connection blocking, extended traffic history time ranges, advanced display and filtering options, and more, is available as an in-app purchase.

...

Read the original on obdev.at »

9 724 shares, 68 trendiness

Introducing the new and upgraded Framework Laptop

Now available with AMD Ryzen™ 7040 Series and 13th Gen Intel® Core™

Performance upgrades for your Framework Laptop 13, with the latest Intel and AMD processor options.

Higher-capacity batteries, improved hinges, matte displays, and more for your Framework Laptop 13.

Get the latest news and product updates from Framework

“The team over at Framework has managed to not just create a laptop that is easily repairable and upgradable, but it’s also a thin, gorgeous, performant laptop.” — Linus Tech Tips

“This is the best laptop you can get right now if you want a completely repairable and upgradeable device.” — Dave2D

Best of the Best Design Award

The time has come for consumer electronics products that are designed to last: products that give you back the power to upgrade, customize, and repair them. We’re excited for the opportunity to fix the consumer electronics industry together.

At our Next Level Event, we launched a colossal set of new products and upgrades, including the new, high-performance 16″ Framework Laptop 16 and both Intel- and AMD-powered Framework Laptop 13. We can’t wait to see what you think!

...

Read the original on frame.work »

10 708 shares, 28 trendiness

Hyundai Promises To Keep Buttons in Cars Because Touchscreen Controls Are Dangerous

Hyundai knows you like to keep your eyes on the road, and it’s giving you the controls to do just that.

Touchscreens and touch controls took over the world of automotive interior design as automakers aimed to build vehicles on the cutting edge of technology and trends. As it turns out, though, sometimes the old ways are best. Hyundai certainly thinks so, as it has pledged to employ real physical buttons in products to come.

Sang Yup Lee, Hyundai’s head of design, reiterated the company’s commitment to buttons at the introduction of the new Hyundai Kona. As reported by CarsGuide, for the Korean automaker, it’s a decision rooted in safety concerns. “We have used the physical buttons quite significantly the last few years. For me, the safety-related buttons have to be a hard key,” said Lee. That’s a real volume knob in the new Kona’s interior, along with physical controls for the HVAC system, too.

It’s a design call that makes a lot of sense. In some modern vehicles, adjusting things like the volume or climate control settings can require diving into menus on a touchscreen, or using your eyes to find a touch control on the dash. In comparison, the tactile feedback of real buttons, dials, and switches lets drivers keep their eyes on the road instead. “When you’re driving, it’s hard to control it. This is why when it’s a hard key it’s easy to sense and feel it,” said Lee. As far as he is concerned, physical controls are a necessity for anything that could impact safety. Hence the physical buttons and dials for items like the HVAC system and volume control.

Lee hinted that while this is a priority for Hyundai today, things may change in the future. In particular, the company will likely look at using touch controls more heavily when autonomous driving becomes mainstream. “When it comes to Level 4 autonomous driving, then we’ll have everything soft key,” said Lee.

Touchscreens and touch controls did offer certain enticements to automakers. They allow a great deal of functions to be controlled with a compact, changeable interface. A handful of touch controls and a touchscreen can also be cheaper and easier to implement than populating buttons all over the cabin. Plus, for a time, they were a sign that an automaker was moving with the times. After the past decade, though, people have grown tired of such novelties. Touch controls are, by and large, less responsive and less practical than the simple buttons of yore. That’s before we even contemplate the frustration of having to dive into a menu system just to turn on a heated seat. Hyundai has clearly identified that the old-school ethos best suits its own interiors, and it isn’t afraid to say so.

Got a tip? Let the author know: lewin@thedrive.com

...

Read the original on www.thedrive.com »
