10 interesting stories served every morning and every evening.




1 1,337 shares, 7 trendiness, 649 words and 8 minutes reading time

The Computer Scientist Responsible For Cut, Copy, and Paste Has Passed Away

The advent of the personal computer wasn't just about making these powerful machines available to everyone; it was also about making them accessible and usable, even for those lacking a computer science degree. Larry Tesler, who passed away on Monday, might not be a household name like Steve Jobs or Bill Gates, but his contributions to making computers and mobile devices easier to use are the highlight of a long career influencing modern computing.

Born in 1945 in New York, Tesler went on to study computer science at Stanford University, and after graduation he dabbled in artificial intelligence research (long before it became a deeply concerning tool) and became involved in the anti-war and anti-corporate monopoly movements, with companies like IBM among his deserving targets. In 1973 Tesler took a job at the Xerox Palo Alto Research Center (PARC), where he worked until 1980. Xerox PARC is famously known for developing the mouse-driven graphical user interface we now all take for granted, and during his time at the lab Tesler worked with Tim Mott to create a word processor called Gypsy that is best known for coining the terms “cut,” “copy,” and “paste” when it comes to commands for removing, duplicating, or repositioning chunks of text.

Xerox PARC is also well known for not cap­i­tal­iz­ing on the ground­break­ing re­search it did in terms of per­sonal com­put­ing, so in 1980 Tesler tran­si­tioned to Apple Computer where he worked un­til 1997. Over the years he held count­less po­si­tions at the com­pany in­clud­ing Vice President of AppleNet (Apple’s in-house lo­cal area net­work­ing sys­tem that was even­tu­ally can­celed), and even served as Apple’s Chief Scientist, a po­si­tion that at one time was held by Steve Wozniak, be­fore even­tu­ally leav­ing the com­pany.

In addition to his contributions to some of Apple's most famous hardware, Tesler was also known for his efforts to make software and user interfaces more accessible. Beyond the now ubiquitous “cut,” “copy,” and “paste” terminology, Tesler was also an advocate for an approach to UI design known as modeless computing, which is reflected in his personal website. In essence, it ensures that user actions remain consistent throughout an operating system's various functions and apps. When they've opened a word processor, for instance, users now just automatically assume that hitting any of the alphanumeric keys on their keyboard will result in that character showing up on-screen at the cursor's insertion point. But there was a time when word processors could be switched between multiple modes where typing on the keyboard would either add characters to a document or alternately allow functional commands to be entered.
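The difference is easiest to see in miniature. Below is a small illustrative sketch (not from the article, and not Tesler's code) contrasting the two approaches: in a modeless editor a printable key always inserts text at the cursor, while in a modal, vi-style editor the same key can instead act as a command depending on the current mode.

```python
# Illustrative sketch (an assumption for clarity, not Tesler's code): how a
# single keystroke is handled in a modeless editor versus a modal one.

class ModelessEditor:
    """Tesler-style: a printable key always inserts text at the cursor."""

    def __init__(self):
        self.text = []

    def key(self, ch):
        self.text.append(ch)  # the same behavior everywhere, always


class ModalEditor:
    """vi-style: the same key means different things depending on the mode."""

    def __init__(self):
        self.text = []
        self.mode = "command"  # starts out interpreting keys as commands

    def key(self, ch):
        if self.mode == "command":
            if ch == "i":
                self.mode = "insert"   # 'i' switches modes instead of typing
            elif ch == "x" and self.text:
                self.text.pop()        # 'x' deletes instead of typing
        else:
            self.text.append(ch)       # inserts only while in insert mode
```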

There are still plenty of soft­ware ap­pli­ca­tions where tools and func­tion­al­ity change de­pend­ing on the mode they’re in (complex apps like Photoshop, for ex­am­ple, where var­i­ous tools be­have dif­fer­ently and per­form very dis­tinct func­tions) but for the most part mod­ern op­er­at­ing sys­tems like Apple’s ma­cOS and Microsoft’s Windows have em­braced user-friend­li­ness through a less com­pli­cated mod­e­less ap­proach.

After leav­ing Apple in 1997, Tesler co-founded a com­pany called Stagecast Software which de­vel­oped ap­pli­ca­tions that made it eas­ier and more ac­ces­si­ble for chil­dren to learn pro­gram­ming con­cepts. In 2001 he joined Amazon and even­tu­ally be­came the VP of Shopping Experience there, in 2005 he switched to Yahoo where he headed up that com­pa­ny’s user ex­pe­ri­ence and de­sign group, and then in 2008 he be­came a prod­uct fel­low at 23andMe. According to his CV, Tesler left 23andMe in 2009 and from then on mostly fo­cused on con­sult­ing work.

While there are undoubtedly countless other contributions Tesler made to modern computing as part of his work on teams at Xerox and Apple that may never come to light, his known contributions are immense. Tesler is one of the major reasons computers moved out of research centers and into homes.

...

Read the original on gizmodo.com »

2 886 shares, 29 trendiness, 1025 words and 11 minutes reading time

Kickstarter Employees Win Historic Union Election

Kickstarter em­ploy­ees voted to form a union with the Office and Professional Employees International Union, which rep­re­sents more than 100,000 white col­lar work­ers. The fi­nal vote was 46 for the union, 37 against, a his­toric win for union­iza­tion ef­forts at tech com­pa­nies.

Kickstarter work­ers are now the first white col­lar work­ers at a ma­jor tech com­pany to suc­cess­fully union­ize in the United States, send­ing a mes­sage to other tech work­ers.

“Everyone was crying [when the results were announced],” Clarissa Redwine, a Kickstarter United organizer who was fired in September, told Motherboard. “I thought it would be close, but I also knew we were going to win. I hope other tech workers feel emboldened and know that it's possible to fight for your workplace and your values. I know my former coworkers will use a seat at the table really well.”

“Today we learned that in a 46 to 37 vote, our staff has decided to unionize,” Kickstarter's CEO Aziz Hasan said in a statement. “We support and respect this decision, and we are proud of the fair and democratic process that got us here. We've worked hard over the last decade to build a different kind of company, one that measures its success by how well it achieves its mission: helping to bring creative projects to life. Our mission has been common ground for everyone here during this process, and it will continue to guide us as we enter this new phase together.”

The union at the Brooklyn-based crowd-funding platform arrives during a period of unprecedented labor organizing among engineers and other white collar tech workers at Google, Amazon, Microsoft and other prominent tech companies—around issues like sexual harassment, ICE contracts, and carbon emissions. Between 2017 and 2019, the number of protest actions led by tech workers nearly tripled. In 2019 alone, tech workers led more than 100 actions, according to the online database “Collective Actions in Tech.”

“I feel like the most important issues [for us] are around creating clearer policies and support for reporting workplace issues and creating clearer mechanisms for hiring and firing employees,” said RV Dougherty, a former trust and safety analyst and core organizer for Kickstarter United who quit in early February. “Right now so much depends on what team you're on and if you have a good relationship with your manager… We also have a lot of pay disparity and folks who are doing incredible jobs but have been kept from getting promoted because they spoke their mind, which is not how Kickstarter should work.”

In the days leading up to the Kickstarter vote count, Motherboard revealed that Kickstarter hired Duane Morris, a Philadelphia law firm that specializes in labor management relations and “maintaining a union-free workplace.” Kickstarter confirmed to Motherboard that it first retained the services of Duane Morris in 2018, before it knew about union organizing at the company, but would not go into detail about whether the firm had advised the company on how to defeat the union, and denied any union-busting activity.

Dating back to its 2009 founding, Kickstarter has tried to distinguish itself as a progressive exception to Silicon Valley tech companies. In 2015, the company's leadership announced it had become a “public benefit corporation.” “Benefit Corporations are for-profit companies that are obligated to consider the impact of their decisions on society, not only shareholders,” the senior leadership wrote at the time. The company has been hailed as one of the most ethical places to work in tech.

Indeed, rather than ded­i­cate its re­sources to max­i­miz­ing profit, Kickstarter has fought for pro­gres­sive causes, like net neu­tral­ity, and against the anti-trans bath­room law in North Carolina.

But in 2018, a heated disagreement broke out between employees and management about whether to leave a project called “Always Punch Nazis” on the platform, according to reporting in Slate. When Breitbart said the project violated Kickstarter's terms of service by inciting violence, management initially planned to remove the project, but then reversed its decision after protest from employees.

Following the con­tro­versy, em­ploy­ees an­nounced their in­ten­tions to union­ize with OPEIU Local 153 in March 2019. And the com­pany made it clear that it did not be­lieve a union was right for Kickstarter.

In a September letter to creators, Kickstarter's CEO Aziz Hasan wrote that “the union framework is inherently adversarial.”

“That dynamic doesn't reflect who we are as a company, how we interact, how we make decisions, or where we need to go,” he continued. “We believe that in many ways it would set us back.”

In September, Kickstarter fired two em­ploy­ees on its union or­ga­niz­ing com­mit­tee within 8 days, in­form­ing a third that his role was no longer needed at the com­pany. Following out­cry from promi­nent cre­ators, the com­pany in­sisted that the two fir­ings were re­lated to job per­for­mance, not union ac­tiv­ity.

The two fired work­ers filed fed­eral un­fair la­bor prac­tice charges with the National Labor Relations Board (NLRB), claim­ing the com­pany re­tal­i­ated against them for union or­ga­niz­ing in vi­o­la­tion of the National Labor Relations Act. (Those charges have yet to be re­solved.) Days later, the com­pany de­nied a re­quest from the union, Kickstarter United, for vol­un­tary recog­ni­tion.

The de­ci­sion to union­ize at Kickstarter fol­lows a se­ries of vic­to­ries for union cam­paigns led by blue col­lar tech work­ers. Last year, 80 Google con­trac­tors in Pittsburgh, 2,300 cafe­te­ria work­ers at Google in Silicon Valley, and roughly 40 Spin e-scooter work­ers in San Francisco voted to form the first unions in the tech in­dus­try. In early February, 15 em­ploy­ees at the de­liv­ery app Instacart in Chicago suc­cess­fully union­ized, fol­low­ing a fierce anti-union cam­paign run by man­age­ment.

By some ac­counts, the cur­rent wave of white col­lar tech or­ga­niz­ing be­gan in early 2018 when the San Francisco tech com­pany Lanetix fired its en­tire 14-software en­gi­neer staff af­ter they filed to union­ize with Communications Workers of America (CWA). Later, the com­pany was forced to cough up $775,000 to set­tle un­fair la­bor prac­tice charges.

Update: This story has been up­dated with com­ment from Kickstarter.

...

Read the original on www.vice.com »

3 855 shares, 32 trendiness, 1112 words and 9 minutes reading time

Enjoy The Extra Day Off! More Bosses Give 4-Day Workweek A Try

Companies around the world are em­brac­ing what might seem like a rad­i­cal idea: a four-day work­week.

The con­cept is gain­ing ground in places as var­ied as New Zealand and Russia, and it’s mak­ing in­roads among some American com­pa­nies. Employers are see­ing sur­pris­ing ben­e­fits, in­clud­ing higher sales and prof­its.

The idea of a four-day work­week might sound crazy, es­pe­cially in America, where the num­ber of hours worked has been climb­ing and where cell­phones and email re­mind us of our jobs 24/7.

But in some places, the four-day con­cept is tak­ing off like a vi­ral meme. Many em­ploy­ers aren’t just mov­ing to 10-hour shifts, four days a week, as com­pa­nies like Shake Shack are do­ing; they’re go­ing to a 32-hour week — with­out cut­ting pay. In ex­change, em­ploy­ers are ask­ing their work­ers to get their jobs done in a com­pressed amount of time.

Last month, a Washington state sen­a­tor in­tro­duced a bill to re­duce the stan­dard work­week to 32 hours. Russian Prime Minister Dmitry Medvedev is back­ing a par­lia­men­tary pro­posal to shift to a four-day week. Politicians in Britain and Finland are con­sid­er­ing some­thing sim­i­lar.

In the U.S., Shake Shack started testing the idea a year and a half ago. The burger chain shortened managers' workweeks to four days at some stores and found that recruitment spiked, especially among women.

Shake Shack's president, Tara Comonte, says the staff loved the perk: “Being able to take their kids to school a day a week, or one day less of having to pay for day care, for example.”

So the company recently expanded its trial to a third of its 164 U.S. stores. Offering that benefit required Shake Shack to find time savings elsewhere, so it switched to computer software to track supplies of ground beef, for example.

“It was a way to increase flexibility,” Comonte says of the shorter week. “Corporate environments have had flexible work policies for a while now. That's not so easy to do in the restaurant business.”

Hundreds — if not thou­sands — of other com­pa­nies are also adopt­ing or test­ing the four-day week. Last sum­mer, Microsoft’s trial in Japan led to a 40% im­prove­ment in pro­duc­tiv­ity, mea­sured as sales per em­ployee.

Much of this is thanks to Andrew Barnes, an archaeologist by training, who never intended to become a global evangelist. “This was not a journey I expected to be on,” he says.

Barnes is CEO of Perpetual Guardian, New Zealand’s largest es­tate plan­ning com­pany. He spent much of his ca­reer be­liev­ing long hours were bet­ter for busi­ness. But he was also dis­turbed by the toll it took on em­ploy­ees and their fam­i­lies, par­tic­u­larly when it came to men­tal health.

So two years ago, he used Perpetual Guardian and its 240 work­ers as guinea pigs, part­ner­ing with aca­d­e­mic re­searchers in Auckland to mon­i­tor and track the ef­fects of work­ing only four days a week.

“Core to this is that people are not productive for every hour, every minute of the day that they're in the office,” Barnes says, which means there was lots of distraction and wasted time that could be cut.

Simply slash­ing the num­ber and du­ra­tion of meet­ings saved huge amounts of time. Also, he did away with open-floor of­fice plans and saw work­ers spend­ing far less time on so­cial me­dia. All this, he says, made it eas­ier to fo­cus more deeply on the work.

Remarkably, work­ers got more work done while work­ing fewer hours. Sales and prof­its grew. Employees spent less time com­mut­ing, and they were hap­pier.

Barnes says there were other, un­ex­pected ben­e­fits: It nar­rowed work­place gen­der gaps. Women — who typ­i­cally took more time off for care­giv­ing — sud­denly had greater flex­i­bil­ity built into their sched­ule. Men also had more time to help with their fam­i­lies, Barnes says.

The com­pany did­n’t po­lice how work­ers spent their time. But if per­for­mance slipped, the firm could re­vert back to the full-week sched­ule. Barnes says that alone mo­ti­vated work­ers.

The Perpetual Guardian study went vi­ral, and things went hay­wire for Barnes.

Employers — including big multinationals — started calling, seeking advice. “Frankly, I couldn't drink enough coffee to deal with the number of companies that approached us,” Barnes says.

Demand was so great that he set up a foun­da­tion to pro­mote the four-day work­week. Ironically, in the process, he’s work­ing a lot of over­time.

“You only get one chance to change the world. And, it's my responsibility at least, on this one, to see if I can influence the world for the better,” he says.

To date, most of that in­ter­est has not come from American em­ploy­ers.

Peter Cappelli, a pro­fes­sor of man­age­ment at the Wharton School of the University of Pennsylvania, says that’s be­cause the con­cept runs counter to American no­tions of work and cap­i­tal­ism. Unions are less pow­er­ful, and work­ers have less po­lit­i­cal sway than in other coun­tries, he says.

So American com­pa­nies an­swer to share­hold­ers, who tend to pri­or­i­tize profit over worker ben­e­fits.

“I just don't see contemporary U.S. employers saying, ‘You know what, if we create more value here, we're gonna give it to the employees.’ I just don't see that happening,” Cappelli says.

Natalie Nagele, co-founder and CEO of Wildbit, has heard from other lead­ers who say it did­n’t work for them. She says it fails when em­ploy­ees aren’t mo­ti­vated and where man­agers don’t trust em­ploy­ees.

But Nagele says mov­ing her Philadelphia soft­ware com­pany to a four-day week three years ago has been a suc­cess.

“We had shipped more features than we had in recent years, we felt more productive, the quality of our work increased. So then we just kept going with it,” Nagele says. Personally, she says, it gives her time to rest her brain, which helps solve complex problems: “You can ask my team, there's multiple times where somebody is like, ‘On Sunday morning, I woke up and … I figured it out.’”

Mikeal Parlow started work­ing a four-day week about a month ago. It was a perk of his new job as a bud­get an­a­lyst in Westminster, Colo.

He works 10 hours a day, Monday through Thursday. Or, as he puts it, until the job is done. Parlow says he much prefers the new way because it is about “getting your work done, more so than feeding the clock.”

That frees Fridays up for life's many delightful chores — like visits to the DMV. “For instance, today we're going to go and get our license plates,” Parlow says.

But that also leaves time on the week­ends … for the week­end.

...

Read the original on www.npr.org »

4 818 shares, 33 trendiness, 60 words and 1 minutes reading time

VGraupera/1on1-questions


...

Read the original on github.com »

5 770 shares, 28 trendiness, 785 words and 8 minutes reading time

Pay Up, Or We’ll Make Google Ban Your Ads — Krebs on Security

A new email-based ex­tor­tion scheme ap­par­ently is mak­ing the rounds, tar­get­ing Web site own­ers serv­ing ban­ner ads through Google’s AdSense pro­gram. In this scam, the fraud­sters de­mand bit­coin in ex­change for a promise not to flood the pub­lish­er’s ads with so much bot and junk traf­fic that Google’s au­to­mated anti-fraud sys­tems sus­pend the user’s AdSense ac­count for sus­pi­cious traf­fic.

Earlier this month, KrebsOnSecurity heard from a reader who main­tains sev­eral sites that re­ceive a fair amount of traf­fic. The mes­sage this reader shared be­gan by quot­ing from an au­to­mated email Google’s sys­tems might send if they de­tect your site is seek­ing to ben­e­fit from au­to­mated clicks. The mes­sage con­tin­ues:

“Very soon the warning notice from above will appear at the dashboard of your AdSense account undoubtedly! This will happen due to the fact that we're about to flood your site with huge amount of direct bot generated web traffic with 100% bounce ratio and thousands of IPs in rotation — a nightmare for every AdSense publisher. More also we'll adjust our sophisticated bots to open, in endless cycle with different time duration, every AdSense banner which runs on your site.”

The message goes on to warn that while the targeted site's ad revenue will be briefly increased, AdSense traffic assessment algorithms will “detect very fast such a web traffic pattern as fraudulent.”

“Next an ad serving limit will be placed on your publisher account and all the revenue will be refunded to advertisers. This means that the main source of profit for your site will be temporarily suspended. It will take some time, usually a month, for the AdSense to lift your ad ban, but if this happens we will have all the resources needed to flood your site again with bad quality web traffic which will lead to second AdSense ban that could be permanent!”

The mes­sage de­mands $5,000 worth of bit­coin to fore­stall the at­tack. In this scam, the ex­tor­tion­ists are likely bet­ting that some pub­lish­ers may see pay­ing up as a cheaper al­ter­na­tive to hav­ing their main source of ad­ver­tis­ing rev­enue evap­o­rate.

The reader who shared this email said while he considered the message likely to be a baseless threat, a review of his recent AdSense traffic statistics showed that detections in his AdSense “invalid traffic report” from the past month had increased substantially.

The reader, who asked not to be iden­ti­fied in this story, also pointed to ar­ti­cles about a re­cent AdSense crack­down in which Google an­nounced it was en­hanc­ing its de­fenses by im­prov­ing the sys­tems that iden­tify po­ten­tially in­valid traf­fic or high risk ac­tiv­i­ties be­fore ads are served.

Google defines invalid traffic as “clicks or impressions generated by publishers clicking their own live ads,” as well as “automated clicking tools or traffic sources.”

“Pretty concerning, though it seems this group is only saying they're planning their attack,” the reader wrote.

Google de­clined to dis­cuss this read­er’s ac­count, say­ing its con­tracts pre­vent the com­pany from com­ment­ing pub­licly on a spe­cific part­ner’s sta­tus or en­force­ment ac­tions. But in a state­ment shared with KrebsOnSecurity, the com­pany said the mes­sage ap­pears to be a clas­sic threat of sab­o­tage, wherein an ac­tor at­tempts to trig­ger an en­force­ment ac­tion against a pub­lisher by send­ing in­valid traf­fic to their in­ven­tory.

“We hear a lot about the potential for sabotage, it's extremely rare in practice, and we have built some safeguards in place to prevent sabotage from succeeding,” the statement explained. “For example, we have detection mechanisms in place to proactively detect potential sabotage and take it into account in our enforcement systems.”

Google said it has ex­ten­sive tools and processes to pro­tect against in­valid traf­fic across its prod­ucts, and that most in­valid traf­fic is fil­tered from its sys­tems be­fore ad­ver­tis­ers and pub­lish­ers are ever im­pacted.

“We have a help center on our website with tips for AdSense publishers on sabotage,” the statement continues. “There's also a form we provide for publishers to contact us if they believe they are the victims of sabotage. We encourage publishers to disengage from any communication or further action with parties that signal that they will drive invalid traffic to their web properties. If there are concerns about invalid traffic, they should communicate that to us, and our Ad Traffic Quality team will monitor and evaluate their accounts as needed.”


...

Read the original on krebsonsecurity.com »

6 704 shares, 27 trendiness, 8318 words and 77 minutes reading time

This Is the Way

The Mandalorian: This Is the Way

Cinematographers Greig Fraser, ASC, ACS and Barry “Baz” Idoine and showrunner Jon Favreau employ new technologies to frame the Disney Plus Star Wars series.

Unit pho­tog­ra­phy by François Duhamel, SMPSP, and Melinda Sue Gordon, SMPSP, cour­tesy of Lucasfilm, Ltd.

At top, the Mandalorian Bounty Hunter (played by Pedro Pascal) rescues the Child — popularly described as “baby Yoda.”

This article is an expanded version of the story that appears in our February 2020 print magazine.

A live-ac­tion Star Wars tele­vi­sion se­ries was George Lucas’ dream for many years, but the lo­gis­tics of tele­vi­sion pro­duc­tion made achiev­ing the nec­es­sary scope and scale seem in­con­ceiv­able. Star Wars fans would ex­pect ex­otic, pic­turesque lo­ca­tions, but it sim­ply was­n’t plau­si­ble to take a crew to the deserts of Tunisia or the salt flats of Bolivia on a short sched­ule and lim­ited bud­get. The cre­ative team be­hind The Mandalorian has solved that prob­lem.

For decades, green- and bluescreen compositing was the go-to solution for bringing fantastic environments and actors together on the screen. (Industrial Light & Magic did pioneering work with the technology for the original Star Wars movie.) However, when characters are wearing highly reflective costumes, as is the case with Mando (Pedro Pascal), the title character of The Mandalorian, the reflection of green- and bluescreen in the wardrobe causes costly problems in post-production. In addition, it's challenging for actors to perform in “a sea of blue,” and for key creatives to have input on shot designs and composition.

This story was orig­i­nally pub­lished in the Feb. 2020 is­sue of AC. Some im­ages are ad­di­tional or al­ter­nate.

In or­der for The Mandalorian to work, tech­nol­ogy had to ad­vance enough that the epic worlds of Star Wars could be ren­dered on an af­ford­able scale by a team whose ac­tual pro­duc­tion foot­print would com­prise a few sound­stages and a small back­lot. An ad­di­tional con­sid­er­a­tion was that the typ­i­cal vi­sual-ef­fects work­flow runs con­cur­rent with pro­duc­tion, and then ex­tends for a lengthy post pe­riod. Even with all the power of con­tem­po­rary dig­i­tal vi­sual-ef­fects tech­niques and bil­lions of com­pu­ta­tions per sec­ond, the process can take up to 12 hours or more per frame. With thou­sands of shots and mul­ti­ple it­er­a­tions, this be­comes a time-con­sum­ing en­deavor. The Holy Grail of vi­sual ef­fects — and a ne­ces­sity for The Mandalorian, ac­cord­ing to co-cin­e­matog­ra­pher and co-pro­ducer Greig Fraser, ASC, ACS — was the abil­ity to do real-time, in-cam­era com­posit­ing on set.

“That was our goal,” says Fraser, who had previously explored the Star Wars galaxy while shooting Rogue One: A Star Wars Story (AC Feb. 17). “We wanted to create an environment that was conducive not just to giving a composition line-up to the effects, but to actually capturing them in real time, photo-real and in-camera, so that the actors were in that environment in the right lighting — all at the moment of photography.”

The so­lu­tion was what might be de­scribed as the heir to rear pro­jec­tion — a dy­namic, real-time, photo-real back­ground played back on a mas­sive LED video wall and ceil­ing, which not only pro­vided the pixel-ac­cu­rate rep­re­sen­ta­tion of ex­otic back­ground con­tent, but was also ren­dered with cor­rect cam­era po­si­tional data.

Mando with the Child on his ship.

If the content was created in advance of the shoot, then photographing actors, props and set pieces in front of this wall could create final in-camera visual effects — or “near” finals, with only technical fixes required, and with complete creative confidence in the composition and look of the shots. On The Mandalorian, this space was dubbed “the Volume.” (Technically, a “volume” is any space defined by motion-capture technology.)

This concept was initially proposed by Kim Libreri of Epic Games while he was at Lucasfilm, and it has become the basis of the technology, that “Holy Grail,” that makes a live-action Star Wars television series possible.

In 2014, as Rogue One was ramping up, the concept of real-time compositing was once again discussed. Technology had matured to a new level. Visual effects supervisor John Knoll had an early discussion with Fraser about this concept, and the cinematographer brought up the notion of using a large LED screen as a lighting instrument to incorporate interactive animated lighting on the actors and sets during composite photography, with rough previsualized effects played back on the LED screens. The final animated VFX would be added in later; the screens were merely to provide interactive lighting to match the animations.

“One of the big problems of shooting blue- and greenscreen composite photography is the interactive lighting,” offers Fraser. “Often, you're shooting real photography elements before the backgrounds are created and you're imagining what the interactive lighting will do — and then you have to hope that what you've done on set will match what happens in post much later on. If the director changes the backgrounds in post, then the lighting isn't going to match and the final shot will feel false.”

Director and executive producer Dave Filoni and cinematographers Greig Fraser, ASC, ACS (center) and Barry “Baz” Idoine (operating camera) on the set.

For Rogue One, they built a large cylindrical LED screen and created all of the backgrounds in advance for the space battle landings on Scarif, Jedha and Eadu; all the cockpit sequences in X-Wing and U-Wing spacecraft were done in front of that LED wall as the primary source of illumination on the characters and sets. Those LED panels had a pixel pitch of 9mm (the distance between the centers of the RGB pixel clusters on the screen). Unfortunately, with the size of the pixel pitch, they could rarely get it far enough away from the camera to avoid moiré and make the image appear photo-real, so it was used purely for lighting purposes. However, because the replacement backgrounds were already built and utilized on set, the comps were extremely successful and perfectly matched the dynamic lighting.

A fisheye view looking through the gap between the two back walls of the show's LED-wall system, known as “the Volume.” The dark spot on the Volume ceiling is due to a different model of LED screens used there. The ceiling is mostly used for lighting purposes, and if seen on camera is replaced in post.

“I went to see Jon and ask him if we would like to do something for Disney's new streaming service,” says Lucasfilm president Kathleen Kennedy. “I've known that Jon has wanted to do a Star Wars project for a long time, so we started talking right away about what he could do that would push technology and that led to a whole conversation around what could change the production path; what could actually create a way in which we could make things differently?”

Favreau had just com­pleted The Jungle Book and was em­bark­ing on The Lion King for Disney — both vi­sual-ef­fects heavy films.

Visual ef­fects su­per­vi­sor Richard Bluff and ex­ec­u­tive cre­ative di­rec­tor and head of ILM Rob Bredow showed Favreau a num­ber of tests that ILM had con­ducted in­clud­ing the tech­nol­ogy of the LED wall from Rogue One. Fraser sug­gested with the ad­vance­ments in LED tech­nol­ogy since Rogue One that this pro­ject could lever­age new pan­els and push the en­ve­lope on real-time, in-cam­era vi­sual ef­fects. Favreau loved the con­cept and de­cided that was the pro­duc­tion path to take.

In the back­ground, ap­pear­ing to float in space, are the mo­tion-track­ing cam­eras peek­ing be­tween the Volume’s wall and ceil­ing.

The pro­duc­tion was look­ing to min­i­mize the amount of green- and blue­screen pho­tog­ra­phy and re­quire­ments of post com­posit­ing to im­prove the qual­ity of the en­vi­ron­ment for the ac­tors. The LED screen pro­vides a con­vinc­ing fac­sim­ile of a real set/​lo­ca­tion and avoids the green void that can be chal­leng­ing for per­form­ers.

“I was very encouraged by my experiences using similar technology on Jungle Book [AC, May 16], and using virtual cameras on The Lion King [AC, Aug. 19],” explains Favreau, series creator and executive producer. “I had also experimented with a partial video wall for the pilot episode of The Orville. With the team we had assembled between our crew, ILM, Magnopus, Epic Games, Profile Studios and Lux Machina, I felt that we had a very good chance at a positive outcome.”

“The Volume is a difficult technology to understand until you stand there in front of the ‘projection’ on the LED screen, put an actor in front of it, and move the camera around,” Fraser says. “It's hard to grasp. It's not really rear projection; it's not a TransLite because [it is a real-time, interactive image with 3D objects] and has the proper parallax; and it's photo-real, not animated, but it is generated through a gaming engine.”

Idoine (left) shooting on the Volume's display of the ice-planet Maldo Kreis — one of many of the production's environment “loads” — with director Filoni watching and Karina Silva operating B camera. The fixtures with white, half-dome, ping-pong-style balls on each camera are the “Sputniks” — infrared-marker configurations that are seen by the motion-tracking cameras to record the production camera's position in 3D space, and to render proper 3D parallax on the Volume wall.

“The technology that we were able to innovate on The Mandalorian would not have been possible had we not developed technologies around the challenges of Jungle Book and Lion King,” offers Favreau. “We had used game-engine and motion-capture [technology] and real-time set extension that had to be rendered after the fact, so real-time render was a natural extension of this approach.”

Barry “Baz” Idoine, who worked with Fraser for several years as a camera operator and second-unit cinematographer on features including Rogue One and Vice (AC Jan. 19), assumed cinematography duties on The Mandalorian when Fraser stepped away to shoot Denis Villeneuve's Dune. Idoine observes, “The strong initial value is that you're not shooting in a green-screen world and trying to emulate the light that will be comped in later — you're actually shooting finished product shots. It gives the control of cinematography back to the cinematographer.”

The Volume was a curved, 20′-high-by-180′-circumference LED video wall, com­pris­ing 1,326 in­di­vid­ual LED screens of a 2.84mm pixel pitch that cre­ated a 270-degree semi­cir­cu­lar back­ground with a 75′-diameter per­for­mance space topped with an LED video ceil­ing, which was set di­rectly onto the main curve of the LED wall.

At the rear of the Volume, in the 90 remaining degrees of open area, essentially “behind camera,” were two 18′-high-by-20′-wide flat panels of 132 more LED screens. These two panels were rigged to traveler track and chain motors in the stage's perms, so the walls could be moved into place or flown out of the way to allow better access to the Volume area.

“The Volume allows us to bring many different environments under one roof,” says visual-effects supervisor Richard Bluff of ILM. “We could be shooting on the lava flats of Nevarro in the morning and in the deserts of Tatooine in the afternoon. Of course, there are practical considerations to switching over environments, but we [typically did] two environments in one day.”

The crew surrounds the Mandalorian's spacecraft Razor Crest. Only the fuselage and cockpit are practical set pieces. From this still-camera position, the composition appears “broken,” but from the production camera's perspective, the engines appear in perfect relationship to the fuselage, and track in parallax with the camera's movement.

“A majority of the shots were done completely in camera,” Favreau adds. “And in cases where we didn't get to final pixel, the postproduction process was shortened significantly because we had already made creative choices based on what we had seen in front of us. Postproduction was mostly refining creative choices that we were not able to finalize on the set in a way that we deemed photo-real.”

With tra­di­tional rear pro­jec­tion (and front pro­jec­tion), in or­der for the re­sult to look be­liev­able, the cam­era must ei­ther re­main sta­tion­ary or move along a pre­pro­grammed path to match the per­spec­tive of the pro­jected im­age. In ei­ther case, the cam­er­a’s cen­ter of per­spec­tive (the en­trance pupil of the lens, some­times re­ferred to — though in­cor­rectly — as the nodal point) must be pre­cisely aligned with the pro­jec­tion sys­tem to achieve proper per­spec­tive and the ef­fects of par­al­lax. The Mandalorian is hardly the first pro­duc­tion to in­cor­po­rate an im­age-pro­jec­tion sys­tem for in-cam­era com­posit­ing, but what sets its tech­nique apart is its abil­ity to fa­cil­i­tate a mov­ing cam­era.

In the pi­lot episode, the Mandalorian (Pedro Pascal) brings his prey (Horatio Sanz) into cus­tody.

Indeed, us­ing a sta­tion­ary cam­era or one locked into a pre-set move for all of the work in the Volume was sim­ply not ac­cept­able for the needs of this par­tic­u­lar pro­duc­tion. The team there­fore had to find a way to track the cam­er­a’s po­si­tion and move­ment in real-world space, and ex­trap­o­late proper per­spec­tive and par­al­lax on the screen as the cam­era moved. This re­quired in­cor­po­rat­ing mo­tion-cap­ture tech­nol­ogy and a videogame en­gine — Epic Games’ Unreal Engine — that would gen­er­ate proper 3D par­al­lax per­spec­tive in real time.
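The geometry behind that requirement can be sketched simply. The snippet below is a toy illustration (not ILM's StageCraft or Unreal code): a virtual object that sits "beyond" the physical wall must be drawn where the line from the tracked camera to that object crosses the wall plane, so its drawn position shifts whenever the camera moves, which is exactly the parallax a static backdrop cannot provide.

```python
# Minimal parallax sketch (an illustration, not the production pipeline):
# intersect the ray from the tracked camera to a virtual point with the
# plane of the LED wall to find where that point must be drawn.

def project_to_wall(camera, virtual_point, wall_z):
    """Intersect the ray camera->virtual_point with the plane z = wall_z."""
    cx, cy, cz = camera
    px, py, pz = virtual_point
    t = (wall_z - cz) / (pz - cz)        # parametric distance along the ray
    return (cx + t * (px - cx), cy + t * (py - cy))

virtual_rock = (2.0, 1.0, 30.0)          # meters "beyond" the physical wall
wall_z = 6.0                             # wall is 6 m from the stage origin

print(project_to_wall((0.0, 1.7, 0.0), virtual_rock, wall_z))  # camera at rest
print(project_to_wall((1.0, 1.7, 0.0), virtual_rock, wall_z))  # camera steps right
# The rock's on-wall position changes with the camera move; a printed TransLite
# backdrop could not do this, which is why the camera must be tracked.
```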

The lo­ca­tions de­picted on the LED wall were ini­tially mod­eled in rough form by vi­sual-ef­fects artists cre­at­ing 3D mod­els in Maya, to the specs de­ter­mined by pro­duc­tion de­signer Andrew Jones and vi­sual con­sul­tant Doug Chiang. Then, wher­ever pos­si­ble, a pho­togram­me­try team would head to an ac­tual lo­ca­tion and cre­ate a 3D pho­to­graphic scan.

“We realized pretty early on that the best way to get photo-real content on the screen was to photograph something,” attests visual-effects supervisor Richard Bluff.

As amaz­ing and ad­vanced as the Unreal Engine’s ca­pa­bil­i­ties were, ren­der­ing fully vir­tual poly­gons on-the-fly did­n’t pro­duce the photo-real re­sult that the film­mak­ers de­manded. In short, 3-D com­puter-ren­dered sets and en­vi­ron­ments were not photo-re­al­is­tic enough to be uti­lized as in-cam­era fi­nal im­ages. The best tech­nique was to cre­ate the sets vir­tu­ally, but then in­cor­po­rate pho­tographs of real-world ob­jects, tex­tures and lo­ca­tions and map those im­ages onto the 3-D vir­tual ob­jects. This tech­nique is com­monly known as tiling or pho­togram­me­try. This is not nec­es­sar­ily a unique or new tech­nique, but the in­cor­po­ra­tion of pho­togram­me­try el­e­ments achieved the goal of cre­at­ing in-cam­era fi­nals.

The Mandalorian makes repairs with a rich landscape displayed behind him.

Additionally, photographic “scanning” of a location, which incorporates taking thousands of photographs from many different viewpoints to generate a 3-D photographic model, is a key component in creating the virtual environments.

Enrico Damm became the environment supervisor for the production and led the scanning and photogrammetry team that would travel to locations such as Iceland and Utah to shoot elements for the Star Wars planets.

The per­fect weather con­di­tion for these pho­to­graphic cap­tures is a heav­ily over­cast day, as there are lit­tle to no shad­ows on the land­scape. A sit­u­a­tion with harsh sun­light and hard shad­ows means that it can­not eas­ily be re-lit in the vir­tual world. In those cases, soft­ware such as Agisoft De-Lighter was used to an­a­lyze the pho­tographs for light­ing and re­move shad­ows to re­sult in a more neu­tral can­vas for vir­tual light­ing.

Scanning is a faster, looser process than photogrammetry, and it is done from multiple positions and viewpoints. For scanning, the more parallax introduced, the better the software can resolve the 3-D geometry. Damm created a custom rig in which the scanner straps six cameras to their body, all of which fire simultaneously as the scanner moves about the location. This allows them to gather six times the images in the same amount of time — about 1,800 on average.

Photogrammetry is used to create virtual backdrops, and images must be shot on a nodal rig to eliminate parallax between the photos. For The Mandalorian, about 30-40% of the Volume's backdrops were created from these photogrammetry images.

Each phase of pho­tog­ra­phy — pho­togram­me­try and scan­ning — needs to be done at var­i­ous times dur­ing the day to cap­ture dif­fer­ent looks to the land­scape.

Lidar scan­ning sys­tems are some­times also em­ployed.

The cam­eras used for scan­ning were Canon EOS 5D MKIV and EOS 5DS with prime lenses. Zooms are some­times in­cor­po­rated as mod­ern stitch­ing soft­ware has got­ten bet­ter about solv­ing mul­ti­ple im­ages from dif­fer­ent fo­cal lengths.

The Mandalorian (aka “Mando,” played by Pedro Pascal) treks through the desert alone.

This information was mapped onto 3D virtual sets and then modified or embellished as necessary to adhere to the Star Wars design aesthetic. If there wasn't a real-world location to photograph, the environments were created entirely by ILM's “environments” visual-effects team. The elements of the locations were loaded into the Unreal Engine video game platform, which provided a live, real-time, 3D environment that could react to the camera's position.

The third shot of Season 1’s first episode demon­strates this tech­nol­ogy with ex­treme ef­fec­tive­ness. The shot starts with a low an­gle of Mando read­ing a sen­sor on the icy planet of Maldo Kreis; he stands on a long walk­way that stretches out to a se­ries of struc­tures on the hori­zon. The skies are full of dark clouds, and a light snow swirls around. Mando walks along the trail to­ward the struc­tures, and the cam­era booms up.

All of this was cap­tured in the Volume, in-cam­era and in real time. Part of the walk­way was a real, prac­ti­cal set, but the rest of the world was the vir­tual im­age on the LED screen, and the par­al­lax as the cam­era boomed up matched per­fectly with the real set. The ef­fect of this sys­tem is seam­less.

Because of the enor­mous amount of pro­cess­ing power needed to cre­ate this kind of im­agery, the full 180′ screen and ceil­ing could not be ren­dered high-res­o­lu­tion, photo-real in real time. The com­pro­mise was to en­ter the spe­cific lens used on the cam­era into the sys­tem, so that it ren­dered a photo-real, high-res­o­lu­tion im­age based on the cam­er­a’s spe­cific field of view at that given mo­ment, while the rest of the screen dis­played a lower-res­o­lu­tion im­age that was still ef­fec­tive for in­ter­ac­tive light­ing and re­flec­tions on the tal­ent, props and phys­i­cal sets. (The sim­pler poly­gon count fa­cil­i­tated faster ren­der­ing times.)

Idoine (far left) discusses a shot of “the Child” (aka “Baby Yoda”) with director Rick Famuyiwa (third from left) and series creator/executive producer Jon Favreau (third from right), while assistant director Kim Richards (second from right, standing) and crewmembers listen. Practical set design was often used in front of the LED screen, and was designed to visually bridge the gap between the real and virtual space. The practical sets were frequently placed on risers to lift the floor and better hide the seam of the LED wall and stage floor.

Each Volume load was put into the Unreal Engine video game platform, which provided the live, real-time, 3D environment that reacted to the production camera's position — which was tracked by Profile Studios' motion-capture system via infrared (IR) cameras surrounding the top of the LED walls that monitored the IR markers mounted to the production camera. When the system recognized the X, Y, Z position of the camera, it then rendered proper 3D parallax for the camera's position in real time. That was fed from Profile into ILM's proprietary StageCraft software, which managed and recorded the information and full production workflow as it, in turn, fed the images into the Unreal Engine. The images were then output to the screens with the assistance of the Lux Machina team.

It takes 11 interlinked computers to serve the images to the wall. Three processors are dedicated to real-time rendering, and four servers provide three 4K images seamlessly side-by-side on the wall and one 4K image on the ceiling. That delivers an image size of 12,288 pixels wide by 2,160 high on the wall and 4,096 x 2,160 on the ceiling. With that kind of imagery, however, the full 270 degrees (plus movable back LED walls) and ceiling cannot be rendered high-resolution photo-real in real time. The compromise is to enter the specific lens used on the camera into the system so that it renders a photo-real, high-resolution image only for the camera's specific field of view at that given moment, while the rest of the screen displays a lower-resolution image that is perfectly effective for interactive lighting and reflections on the talent, props and physical sets, but of a simpler polygon count for faster rendering times.

Mando stands in a canyon on the planet Arvala. The rocks be­hind him are on the LED wall, while some prac­ti­cal rocks are placed in the mid- and fore­ground to blend the tran­si­tion. The floor of the stage is cov­ered in mud and rocks for this lo­ca­tion. On the jib is an Arri Alexa LF with a Panavision Ultra Vista anamor­phic lens.

Due to the 10-12 frames (roughly half a sec­ond) of la­tency from the time Profile’s sys­tem re­ceived cam­era-po­si­tion in­for­ma­tion to Unreal’s ren­der­ing of the new po­si­tion on the LED wall, if the cam­era moved ahead of the ren­dered frus­tum (a term defin­ing the vir­tual field of view of the cam­era) on the screen, the tran­si­tion line be­tween the high-qual­ity per­spec­tive ren­der win­dow and the lower-qual­ity main ren­der would be vis­i­ble. To avoid this, the frus­tum was pro­jected an av­er­age of 40-percent larger than the ac­tual field of view of the cam­era/​lens com­bi­na­tion, to al­low some safety mar­gin for cam­era moves. In some cases, if the lens’ field of view — and there­fore the frus­tum — was too wide, the sys­tem could not ren­der an im­age high-res enough in real time; the pro­duc­tion would then use the im­age on the LED screen sim­ply as light­ing, and com­pos­ite the im­age in post [with a green­screen added be­hind the ac­tors]. In those in­stances, the back­grounds were al­ready cre­ated, and the match was seam­less be­cause those ac­tual back­grounds had been used at the time of pho­tog­ra­phy [to light the scene].
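As a rough illustration of that safety margin, the sketch below computes a simple rectilinear horizontal field of view for an assumed lens and sensor and then widens the angle by 40 percent. The focal length, the sensor width and the spherical-lens model are assumptions for illustration only, not production data, and treating the margin as a simple scale on the angle is a simplification of how the render window is actually sized.

```python
import math

# Back-of-the-envelope sketch of the "oversized frustum" idea described above.
# Numbers are illustrative assumptions, not figures from the production.

def horizontal_fov_deg(focal_length_mm, sensor_width_mm):
    """Horizontal angle of view for a simple rectilinear lens model."""
    return 2 * math.degrees(math.atan(sensor_width_mm / (2 * focal_length_mm)))

sensor_width = 36.70                              # approx. Alexa LF width, mm
fov = horizontal_fov_deg(50, sensor_width)        # e.g. a 50mm spherical lens
padded = fov * 1.4                                # ~40% wider render window

print(f"lens FOV: {fov:.1f} deg, rendered frustum: {padded:.1f} deg")
```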

Fortunately, says Fraser, Favreau wanted The Mandalorian to have a visual aesthetic that would match that of the original Star Wars. This meant a more “grounded” camera, with slow pans and tilts, and non-aggressive camera moves — an aesthetic that helped to hide the system latency. “In addition to using some of the original camera language in Star Wars, Jon is deeply inspired by old Westerns and samurai films, so he also wanted to borrow a bit from those, especially Westerns,” Fraser notes. “The Mandalorian is, in essence, a gunslinger, and he's very methodical. This gave us a set of parameters that helped define the look of the show. At no point will you see an 8mm fisheye lens in someone's face. That just doesn't work within this language.

“It was also of paramount importance to me that the result of this technology not just be ‘suitable for TV,’ but match that of major, high-end motion pictures,” Fraser continues. “We had to push the bar to the point where no one would really know we were using new technology; they would just accept it as is. Amazingly, we were able to do just that.”

Steadicam operator Simon Jayes tracks Mando, Mayfeld (Bill Burr) and Ran Malk (Mark Boone Jr.) in front of the LED wall. While the 10- to 12-frame latency of rendering the high-resolution “frustum” on the wall can be problematic, Steadicam was employed liberally in Episode 6 to great success.

Shot on Arri’s Alexa LF, The Mandalorian was the maiden voy­age for Panavision’s full-frame Ultra Vista 1.65x anamor­phic lenses. The 1.65x anamor­phic squeeze al­lowed for full uti­liza­tion of the 1.44:1 as­pect ra­tio of the LF to cre­ate a 2.37:1 na­tive as­pect ra­tio, which was only slightly cropped to 2.39:1 for ex­hi­bi­tion.
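The arithmetic behind that native ratio is straightforward; the quick check below (an illustration, not production code) multiplies the sensor's 1.44:1 aspect by the 1.65x squeeze.

```python
# Quick arithmetic check of the aspect-ratio figures above: a 1.44:1 sensor
# recorded through a 1.65x anamorphic squeeze unsqueezes to roughly 2.37:1,
# then is cropped slightly to 2.39:1 for exhibition.
print(1.44 * 1.65)   # 2.376
```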

“We chose the LF for a couple reasons,” explains Fraser. “Star Wars has a long history of anamorphic photography, and that aspect ratio is really key. We tested spherical lenses and cropping to 2.40, but it didn't feel right. It felt very contemporary, not like the Star Wars we grew up with. Additionally, the LF's larger sensor changes the focal length of the lens that we use for any given shot to a longer lens and reduces the overall depth of field. The T2.3 of the Ultra Vistas is more like a T0.8 in Super 35, so with less depth of field, it was easier to put the LED screen out of focus faster, which avoided a lot of issues with moiré. It allows the inherent problems in a 2D screen displaying 3D images to fall off in focus a lot faster, so the eye can't tell that those buildings that appear to be 1,000 feet away are actually being projected on a 2D screen only 20 feet from the actor.”

Fraser op­er­ates an Alexa LF, shoot­ing a close-up of the Ugnaught Kuiil (Misty Rosas in the suit, voiced by Nick Nolte). The tran­si­tion be­tween the bot­tom of the LED wall and the stage floor is clearly seen here. That area was of­ten ob­scured by phys­i­cal pro­duc­tion de­sign or re­placed in post.

“The Ultra Vistas were a great choice for us because they have a good amount of character and softness,” Fraser continues. “Photographing the chrome helmet on Mando is a challenge — its super-sharp edges can quickly look video-like if the lens is too sharp. Having a softer acutance in the lens, which [Panavision senior vice president of optical engineering and ASC associate] Dan Sasaki [modified] for us, really helped. The lens we used for Mando tended to be a little too soft for human faces, so we usually shot Mando wide open, compensating for that with ND filters, and shot people 2⁄3 stop or 1 stop closed.”

According to Idoine, the production used 50mm, 65mm, 75mm, 100mm, 135mm, 150mm and 180mm Ultra Vistas that range from T2 to T2.8, and he and Fraser tended to expose at T2.5-T3.5. “Dan Sasaki gave us two prototype Ultra Vistas to test in June 2018,” he says, “and from that we worked out what focal-length range to build.”

Director Bryce Dallas Howard confers with actress Gina Carano — as mercenary Cara Dune — while shooting the episode “Chapter 4: Sanctuary.”

“Our desire for cinematic imagery drove every choice,” Idoine adds. And that included the incorporation of a LUT emulating Kodak's short-lived 500T 5230 color negative, a favorite of Fraser's. “I used that stock on Killing Them Softly [AC Oct. 12] and Foxcatcher [AC Dec. 14], and I just loved its creamy shadows and the slight magenta cast in the highlights,” says Fraser. “For Rogue One, ILM was able to develop a LUT that emulated it, and I've been using that LUT ever since.”

“Foxcatcher was the last film I shot on the stock, and then Kodak discontinued it,” continues Fraser. “At the time, we had some stock left over and I asked the production if we could donate it to an Australian film student and they said ‘yes,’ so we sent several boxes to Australia. When I was prepping Rogue One, I decided that was the look I wanted — this 5230 stock — but it was gone. On a long shot, I wrote an email to the film student to see if he had any stock left and, unbelievably, he had 50 feet in the bottom of his fridge. I had him send that directly to ILM and they created a LUT from it that I used on Rogue and now Mandalorian.”

Actor Giancarlo Esposito as Moff Gideon, an Imperial search­ing for the Child.

A significant key to the Volume's success in creating in-camera final VFX is color-matching the wall's LED output with the color matrix of the Arri Alexa LF camera. ILM's Matthias Scharfenberg, J. Schulte and their team did thorough testing of the ROE Black Pearl LED panels' capabilities and matched that with the color sensitivity and reproduction of the LF to make them seamless partners. LEDs are very narrow-band color-spectrum emitters; their red, green and blue diodes output very narrow spectra of color, which makes reaching some colors very difficult, and making them compatible with the color filter array on the ALEV-III sensor was a bit of a challenge. Utilizing a carefully designed series of color patches, a calibration sequence was run on the LED wall to sync with the camera's sensitivity. This means any other model of camera shooting on the Volume will not receive proper color, but the Alexa LF will. While the color reproduction of the LEDs may not have looked right to the eye, through the camera it appeared seamless. This means that off-the-shelf LED panels won't quite work with the accuracy necessary for a high-end production, but, with custom tweaking, they were successful. There were limitations, however. With low-light backgrounds, the screens would block up and alias in the shadows, making them unsuitable for in-camera finals — although with further development of the color science this has been solved for season two.
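One common way to do that kind of patch-based matching, offered here only as a generic, hedged sketch and not ILM's actual color pipeline, is to display known RGB patches on the wall, measure what the camera records for each, fit a 3x3 correction matrix by least squares, and pre-correct content with its inverse so it photographs as intended.

```python
import numpy as np

# Generic patch-calibration sketch (an assumption about method, not the
# show's pipeline). The "measured" values here are simulated with a made-up
# crosstalk matrix; in practice they would come from photographing the wall.

rng = np.random.default_rng(0)
patches_sent = rng.uniform(0.05, 0.95, size=(24, 3))      # 24 test patches

true_mix = np.array([[0.92, 0.05, 0.03],                   # stand-in for the
                     [0.04, 0.90, 0.06],                   # unknown wall+camera
                     [0.02, 0.07, 0.91]])                  # channel crosstalk
patches_seen = patches_sent @ true_mix

M, *_ = np.linalg.lstsq(patches_sent, patches_seen, rcond=None)
correction = np.linalg.inv(M)                              # applied to content

content_rgb = np.array([[0.5, 0.3, 0.7]])
print(content_rgb @ correction @ true_mix)                 # ~= [0.5, 0.3, 0.7]
```

A real calibration also has to handle the non-linearities and shadow behavior the article mentions, which is why a single matrix is only the starting point.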

A sig­nif­i­cant as­set to the LED Volume wall and im­ages pro­jected from it is the in­ter­ac­tive light­ing pro­vided on the ac­tors, sets and props within the Volume. The light that is pro­jected from the im­agery on the LED wall pro­vides a re­al­is­tic sense of the ac­tor (or set/​props) be­ing within that en­vi­ron­ment in a way that is rarely achiev­able with green- or blue­screen com­pos­ite pho­tog­ra­phy. If the sun is low on the hori­zon on the LED wall, the po­si­tion of the sun on the wall will be sig­nif­i­cantly brighter than the sur­round­ing sky. This brighter spot will cre­ate a bright high­light on the ac­tors and ob­jects in the Volume just as a real sun would from that po­si­tion. Reflections of el­e­ments of the en­vi­ron­ment from the walls and ceil­ing show up in Mando’s cos­tume as if he were ac­tu­ally in that real-world lo­ca­tion.

“When you're dealing with a reflective subject like Mando, the world outside the camera frame is often more important than the world you see in the camera's field of view,” Fraser says. “What's behind the camera is reflected in the actor's helmet and costume, and that's crucial to selling the illusion that he's in that environment. Even if we were only shooting in one direction on a particular location, the virtual art-department would have to build a 360-degree set so we could get the interactive lighting and reflections right. This was also true for practical sets that were built onstage and on the backlot — we had to build the areas that we would never see on camera because they would be reflected in the suit. In the Volume, it's this world outside the camera that defines the lighting.

“When you think about it, unless it's a practical light in shot, all of our lighting is outside the frame — that's how we make movies,” Fraser continues. “But when most of your lighting comes from the environment, you have to shape that environment carefully. We sometimes have to add a practical or a window into the design, which provides our key light even though we never see that [element] on camera.”

The fight with the mud­horn likely negated any worry about hel­met re­flec­tions for this scene.

The interactive lighting of the Volume also significantly reduces the requirement for traditional film production lighting equipment and crew. The light emitted from the LED screens becomes the primary lighting on the actors, sets and props within the Volume. Since this light comes from a virtual image of the set or location, the organic nature of the quality of the light on the elements within the Volume firmly grounds those elements into the reality presented.

There were, of course, limitations. Although LEDs are bright and capable of emitting a good deal of light, they cannot re-create the intensity and quality of direct, natural daylight. "The sun on the LED screen looks perfect because it's been photographed, but it doesn't look good on the subjects — they look like they're in a studio," Fraser attests. "It's workable for close-ups, but not really for wide shots. For moments with real, direct sunlight, we headed out to the backlot as much as possible." That "backlot" was an open field near the Manhattan Beach Studios stages, where the art department built various sets. (Several stages were used for creating traditional sets as well.)

Overcast skies, however, proved a great source in the Volume. The skies for each "load" — the term given for each new environment loaded onto the LED walls — were based on real, photographed skies. While shooting a location, the photogrammetry team shot multiple stills at different times of day to create "sky domes." This enabled the director and cinematographer to choose the sun position and sky quality for each set. "We can create a perfect environment where you have two minutes to sunset frozen in time for an entire 10-hour day," Idoine notes. "If we need to do a turnaround, we merely rotate the sky and background, and we're ready to shoot!"
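
That kind of instant "turnaround" is cheap because a sky dome is typically stored as an equirectangular (lat-long) image, where rotating the world about its vertical axis amounts to a horizontal pixel shift. A toy sketch of the idea, not the production tooling (the loader call is hypothetical):

    import numpy as np

    def rotate_sky_dome(equirect: np.ndarray, yaw_degrees: float) -> np.ndarray:
        """Rotate an equirectangular (lat-long) sky image about the vertical axis.

        Columns span 0..360 degrees of azimuth, so a yaw rotation is just a
        circular shift of columns; elevation (rows) is untouched.
        """
        height, width = equirect.shape[:2]
        shift = int(round(yaw_degrees / 360.0 * width))
        return np.roll(equirect, shift, axis=1)

    # e.g. swing the sun from behind the camera to in front of it for a turnaround:
    # sky = load_equirect("two_minutes_to_sunset.exr")   # hypothetical loader
    # reverse_angle_sky = rotate_sky_dome(sky, 180.0)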

Idoine (seated at cam­era) in dis­cus­sion with Favreau and Filoni on a prac­ti­cal set.

During prep, Fraser and Idoine spent a lot of time in the vir­tual art de­part­ment, whose crew cre­ated the vir­tual back­grounds for the LED loads. They spent many hours go­ing through each load to set sky-dome choices and pick the per­fect time of day and sun po­si­tion for each mo­ment. They could se­lect the sky con­di­tion they wanted, ad­just the scale and the ori­en­ta­tion, and fi­nesse all of these at­trib­utes to find the best light­ing for the scene. Basic, real-time ray trac­ing helped them see the ef­fects of their choices on the vir­tual ac­tors in the pre­vis scene. These choices would then be saved and sent off to ILM, whose artists would use these rougher as­sets for ref­er­ence and build the high-res­o­lu­tion dig­i­tal as­sets.

The virtual art department starts its job by creating 3-D virtual sets of each location to production designer Andrew Jones' specifications; the director and cinematographer can then enter the virtual location with VR headsets and do a virtual scout. Digital actors, props and sets are added and can be moved about, and coverage is chosen during the virtual scout. The cinematographer then follows the process as the virtual set is further textured with photogrammetry elements and the sky domes are added.

The virtual world on the LED screen is fantastic for many uses, but an actor obviously cannot walk through the screen, so an open doorway doesn't work when it's virtual. Doors are one aspect of production design that must be physical: if a character walks through a door, that door has to be real.

Favreau gets his west­ern-style sa­loon en­trance from the first episode of The Mandalorian.

If an actor is close to a set piece, it is generally preferred that the piece be physical rather than virtual; if they're close to a wall, it should be a physical wall, so that they are actually near something real.

Many ob­jects that are phys­i­cal are also vir­tual. Even if a prop or set piece is phys­i­cally con­structed, it is scanned and in­cor­po­rated into the vir­tual world so that it be­comes not only a prac­ti­cal as­set, but a dig­i­tal one as well. Once it’s in the vir­tual world, it can be turned on or off on a par­tic­u­lar set or du­pli­cated.

"We take objects that the art department have created and we employ photogrammetry on each item to get them into the game engine," explains Clint Spillers, Virtual Production Supervisor. "We also keep the thing that we scanned and we put it in front of the screen, and we've had remarkable success getting the foreground asset and the digital object to live together very comfortably."

Another production-design challenge is the requirement that every set be executed in full 360 degrees. While in traditional filmmaking a production designer may be tempted to shortcut a design, knowing the camera will only see a small portion of a particular set, in this world the set that is off camera is just as important as the set seen on camera.

"This was a big revelation for us early on," attests production designer Andrew Jones. "We were, initially, thinking of this technology as a backdrop — like an advanced translight or painted backdrop — that we would shoot against and hope to get in-camera final effects. We imagined that we would design our sets as you would on a normal film: i.e., the camera sees over here, so this is what we need to build. In early conversations with DP Greig Fraser he explained that the off-camera portion of the set — that might never be seen on camera — was just as vital to the effect. The whole Volume is a light box, and what is behind the camera is reflected on the actors' faces, costumes and props. What's behind the camera is actually the key lighting on the talent."

"This concept radically changed how we approach the sets," Jones continues. "Anything you put in the Volume is lit by the environment, so we have to make sure that we conceptualize and construct the virtual set of every location in its entirety, in full 360. Since the actor is, in essence, a chrome ball, he's reflecting what is all around him, so every detail needs to be realized."

They sometimes used photogrammetry as the basis, but always relied upon the same visual-effects artists who create environments for the Star Wars films to realize these real-time worlds — "baking in" lighting choices established earlier in the pipeline with high-end, ray-traced rendering.

"I chose the sky domes that worked best for all the shots we needed for each sequence on the Volume," Fraser notes. "After they were chosen and ILM had done their work, I couldn't raise or lower the sun because the lighting and shadows would be baked in, but I could turn the whole world to adjust where the hot spot was."

Fraser noted a limitation of the adjustments that can be made to the sky domes once they're live on the Volume after ILM's finalization: the world can be rotated, the center position can be changed, and the intensity and color can be adjusted, but the actual position of the sun in the sky dome can't be altered, because ILM has already ray-traced the scene and "baked" the terrain shadows for that sun position. This is done to minimize the computation needed for advanced ray tracing in real time. If the chosen sun position were changed, those baked-in shadows wouldn't change; only the elements reserved for real-time rendering and simple ray tracing would be affected, which would make the backgrounds look false because the lighting direction wouldn't match the baked-in shadows.

From time to time, tra­di­tional light­ing fix­tures were added to aug­ment the out­put of the Volume.

In the fourth episode, the Mandalorian is looking to lay low; he travels to the remote farming planet of Sorgan and visits the common house, a thatched, basket-weave structure. The actual common house was a miniature built by the art department and then photographed for inclusion in the virtual world. The miniature was lit with a single, hard light source that emulated natural daylight breaking through the thatched walls. "You could clearly see that one side of the common house was in hard light and the other side was in shadow," recalls Idoine. "There were hot spots in the model that really looked great, so we incorporated LED "movers" with slash gobos and Charlie Bars [long flags] to break up the light in a similar basket-weave pattern. Because of this very open basket-weave construction and the fact that the load had a lot of shafts of light, I added in random slashes of hard light into the practical set and it mixed really well."

The Volume could incorporate virtual lighting, too, via the "Brain Bar," a NASA Mission Control-like section of the soundstage where as many as a dozen artists from ILM, Unreal and Profile sat at workstations and made the technology of the Volume function. Their work included on-the-fly color-correction adjustments and virtual-lighting tools, among other tweaks.

Matt Madden, president of Profile and a member of the Brain Bar team, worked closely with Fraser, Idoine and gaffer Jeff Webster to incorporate virtual-lighting tools via an iPad that communicated back to the Bar. He could create shapes of light on the wall of any size, color and intensity. If the cinematographer wanted a large, soft source off-camera, Madden was able to create a "light card" of white just outside the frustum. The entire wall outside the camera's angle of view could be a large light source of any intensity or color that the LEDs could reproduce.
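
The article doesn't describe Profile's tool beyond the iPad interface, but the effect of a "light card" can be pictured as painting a flat, bright region into the wall's image outside the camera frustum. A minimal sketch under that assumption (the function and its hard-edged override are illustrative; a real tool would feather edges and track the frustum):

    import numpy as np

    def add_light_card(wall: np.ndarray, x: int, y: int, w: int, h: int,
                       color=(1.0, 1.0, 1.0), intensity: float = 1.0) -> np.ndarray:
        """Paint a soft-source 'light card' onto the LED wall image.

        wall: float RGB image (H, W, 3) in the wall's working space, 0..1.
        The card simply overrides that region with a flat color scaled by
        intensity; production tools would also keep it outside the camera
        frustum so it never appears in shot.
        """
        out = wall.copy()
        out[y:y + h, x:x + w] = np.clip(np.array(color) * intensity, 0.0, 1.0)
        return out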

In this case, the LED wall was made up of Roe Black Pearl BP2 screens with a maximum brightness of 1,800 nits; 10.764 nits are roughly equal to 1 foot-candle of light, so at peak brightness the wall could create an intensity of about 167 foot-candles. That's the equivalent of f/8 and three-quarters at ISO 800 (24 fps, 180-degree shutter). While the Volume was never shot at peak full white, any "lighting cards" that were added were capable of outputting this brightness.
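
That aperture figure can be sanity-checked with the standard incident-light exposure equation. The meter calibration constant, and the article's shorthand of treating nits as directly convertible to foot-candles, are assumptions here, so expect agreement with the quoted f/8-and-three-quarters only to within a fraction of a stop:

    import math

    # Rough check of the exposure numbers above. The calibration constant C and
    # the nits-to-foot-candle shorthand are assumptions, so agreement is only
    # expected to within a fraction of a stop.

    peak_nits = 1800.0
    nits_per_footcandle = 10.764            # same factor as lux per foot-candle
    footcandles = peak_nits / nits_per_footcandle
    print(f"peak output ~ {footcandles:.0f} fc")        # ~ 167 fc

    iso = 800
    shutter = 1.0 / 48.0                    # 24 fps with a 180-degree shutter
    C = 250.0                               # common flat-receptor incident constant (lux)
    lux = footcandles * 10.764              # back to lux for the exposure equation

    # Incident-light exposure equation: N^2 = E * S * t / C
    N = math.sqrt(lux * iso * shutter / C)
    stops_from_f8 = 2.0 * math.log2(N / 8.0)
    print(f"suggested stop ~ f/{N:.1f} ({stops_from_f8:+.1f} stops from f/8)")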

Idoine discovered that a great additional source for Mando was a long, narrow band of white near the top of the LED wall. "This wraparound source created a great backlight look on Mando's helmet," Idoine says. Alternatively, he and Fraser could request a tall, narrow band of light on the wall that would reflect on Mando's full suit, similar to the way a commercial photographer might light a wine bottle or a car — using specular reflections to define shape.

Additionally, virtual black flags — meaning areas where the LED wall was set to black — could be added wherever needed, and at whatever size. The transparency of the black could also be adjusted to any percentage, creating virtual nets.
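
In image terms, a virtual flag or net is just the light card in reverse: instead of painting a region up, you multiply it down. A tiny sketch in the same vein as the light-card example above (again an illustrative assumption, not the actual Brain Bar tooling):

    def add_virtual_flag(wall, x, y, w, h, opacity=1.0):
        """Darken a rectangle of the wall image (a float RGB numpy array, 0..1).

        opacity=1.0 behaves like a solid black flag; smaller values scale the
        region down by a percentage, acting like a virtual net.
        """
        out = wall.copy()
        out[y:y + h, x:x + w] *= (1.0 - opacity)
        return out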

...

Read the original on ascmag.com »

7 702 shares, 20 trendiness, 752 words and 6 minutes reading time

Amazon Let a Fraudster Keep My Sony a7R IV and Refunded Him $2,900

I am an amateur photographer, and I've sold cameras non-professionally on Amazon for over eight years as I've upgraded. That trend comes to an end with my most recent transaction. In December, I sold a mint-in-box Sony a7R IV, and the buyer used a combination of social engineering and ambiguity to end up with not only the camera, but also the money he paid me.

Amazon’s A-to-Z Guarantee did not pro­tect me as a seller. Based on my ex­pe­ri­ence with this trans­ac­tion, I can­not in good faith rec­om­mend sell­ing cam­eras on Amazon any­more.

Author’s Note: This is a sum­mary and my per­sonal opin­ion, and not that of my em­ployer or any­one else.

I ordered a second Sony a7R IV as a backup for a photo shoot. My plan was to then resell it, as the seller fees were slightly less than the rental fees at the time. I listed it on Amazon, and it was almost instantly purchased by a buyer from Florida. I took photos of the camera as I prepared it for shipment, and used confirmed and insured two-day FedEx. The package arrived at the buyer's address on December 17th.

The buyer listed an ini­tial for his last name—that should have been a red flag. It gave him a layer of anonymity that will be rel­e­vant later.

On December 24th, I apparently ruined Christmas: the buyer claimed that accessories were missing. Throughout this whole ordeal, I've never heard directly from the buyer, in spite of numerous email communications. He never told me which "product purchased/packaging" was missing.

I started a claim with Amazon, show­ing the pho­to­graphic ev­i­dence of a mint-in-box a7R 4 with all ac­ces­sories. I de­nied the re­turn. The buyer of­fered no pho­to­graphic proof or other ev­i­dence that he re­ceived any­thing but a mint cam­era.

To this day, I have no idea what he claimed was missing” from the pack­age. I even in­cluded all the orig­i­nal plas­tic wrap!

After about a week of back-and-forth emails, Amazon ini­tially agreed with me.

Somehow, a second support ticket got opened for the same item. The issue was not yet resolved, and the buyer kept trying to claw the money back. The next day, I got an email about a "refund request initiated." On this second ticket, Amazon turned against me.

Now we're in 2020. The buyer apparently shipped the camera back to me; however, he entered the wrong address (forgetting my last name, among other things). The package was returned to sender, and I never got to inspect what was inside. Whether that box contained the camera in like-new condition as I'd sent it, a used camera, or a box of stones is an eternal mystery.

Truly, had he shipped it to the right ad­dress, I would have had mul­ti­ple wit­nesses and video footage of the un­box­ing.

Here's where it gets interesting: as I appealed the claim, Amazon noted that the buyer is not responsible for shipping the item back to the correct address, and that they can indeed keep the item if they want to, following the initiation of an A-to-Z Guarantee claim.

Indeed, I have a pa­per trail of emails that I in fact sent to Amazon. Somehow, they got their tick­ets con­fused. When I fol­lowed up on this email, they shut off com­mu­ni­ca­tion.

So, as a buyer, you can keep an item with "no obligation to return," even if you can't substantiate your claim of missing "items or box." Now the buyer has the camera, and the cash.

The whole ex­pe­ri­ence has been frus­trat­ing, hu­mil­i­at­ing, and sick­en­ing to my pho­tog­ra­phy hobby. I hope that this serves as a cau­tion­ary tale for sell­ing such goods on Amazon, if my ex­pe­ri­ence is any in­di­ca­tion.

As of now, I’ve emailed the buyer again to ship the cam­era back to me, and I have a case open with Amazon, in which I pro­vide the 23 emails they claim I never sent them. That case was closed with no re­sponse from Amazon. I had an ini­tially-sym­pa­thetic ear through their Twitter sup­port, un­til I men­tioned the specifics of my case.

* If you’re go­ing to sell on Amazon or else­where, take an ac­tual video of you pack­ing the cam­era. You need all the de­fense you can get against items mys­te­ri­ously dis­ap­pear­ing.

* Investigate more even-handed sell­ing ser­vices, like eBay, Fred Miranda, or other on­line re­tail­ers.

* If you need a backup cam­era, go ahead and rent one. I’m a fre­quent cus­tomer of BorrowLenses, and I in­fi­nitely re­gret not us­ing them this time.

* Update your personal-articles insurance policy to cover any time the camera is in your possession, and use something like MyGearVault to keep track of all the serial numbers. I only had the camera for a couple of days altogether, but that was enough.

I hope that this was a worst-case, every­thing-goes-wrong sce­nario, and I hope that it does­n’t hap­pen to any­one else. There ought to be more even-handed fail­safes for these trans­ac­tions.

About the au­thor: Cliff is an am­a­teur land­scape and travel pho­tog­ra­pher. You can find more of his work on his web­site and Instagram ac­count.

...

Read the original on petapixel.com »

8 694 shares, 26 trendiness, 1117 words and 10 minutes reading time

Radical hydrogen-boron reactor leapfrogs current nuclear fusion tech

"We are sidestepping all of the scientific challenges that have held fusion energy back for more than half a century," says the director of an Australian company that claims its hydrogen-boron fusion technology is already working a billion times better than expected.

HB11 Energy is a spin-out com­pany that orig­i­nated at the University of New South Wales, and it an­nounced to­day a swag of patents through Japan, China and the USA pro­tect­ing its unique ap­proach to fu­sion en­ergy gen­er­a­tion.

Fusion, of course, is the long-awaited clean, safe the­o­ret­i­cal so­lu­tion to hu­man­i­ty’s en­ergy needs. It’s how the Sun it­self makes the vast amounts of en­ergy that have pow­ered life on our planet up un­til now. Where nu­clear fis­sion — the split­ting of atoms to re­lease en­ergy — has proven in­cred­i­bly pow­er­ful but in­sanely de­struc­tive when things go wrong, fu­sion promises re­li­able, safe, low cost, green en­ergy gen­er­a­tion with no chance of ra­dioac­tive melt­down.

It's just always been 20 years away from being 20 years away. A number of multi-billion-dollar projects are pushing slowly forward, from the Max Planck Institute's insanely complex Wendelstein 7-X stellarator to the 35-nation ITER tokamak project, and most rely on a deuterium-tritium thermonuclear fusion approach that requires the creation of ludicrously hot temperatures, much hotter than the surface of the Sun, at up to 15 million degrees Celsius (27 million degrees Fahrenheit). This is where HB11's tech takes a sharp left turn.

The result of decades of research by Emeritus Professor Heinrich Hora, HB11's approach to fusion does away with rare, radioactive and difficult fuels like tritium altogether — as well as those incredibly high temperatures. Instead, it uses plentiful hydrogen and boron-11, employing the precise application of some very special lasers to start the fusion reaction.

Here's how HB11 describes its "deceptively simple" approach: "The design is a largely empty metal sphere, where a modestly sized HB11 fuel pellet is held in the center, with apertures on different sides for the two lasers. One laser establishes the magnetic containment field for the plasma and the second laser triggers the 'avalanche' fusion chain reaction. The alpha particles generated by the reaction would create an electrical flow that can be channeled almost directly into an existing power grid with no need for a heat exchanger or steam turbine generator."

HB11's Managing Director Dr. Warren McKenzie clarifies over the phone: "A lot of fusion experiments are using the lasers to heat things up to crazy temperatures — we're not. We're using the laser to massively accelerate the hydrogen through the boron sample using non-linear forces. You could say we're using the hydrogen as a dart, and hoping to hit a boron atom; if we hit one, we can start a fusion reaction. That's the essence of it. If you've got a scientific appreciation of temperature, it's essentially the speed of atoms moving around. Creating fusion using temperature is essentially randomly moving atoms around and hoping they'll hit one another; our approach is much more precise."

"The hydrogen/boron fusion creates a couple of helium atoms," he continues. "They're naked heliums, they don't have electrons, so they have a positive charge. We just have to collect that charge. Essentially, the lack of electrons is a product of the reaction and it directly creates the current."
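
For reference, the reaction McKenzie is describing is standard proton-boron-11 (aneutronic) fusion, which in fact yields three alpha particles and releases about 8.7 MeV as their kinetic energy; that positive charge is what HB11 proposes to harvest directly:

    \mathrm{p} \;+\; {}^{11}\mathrm{B} \;\longrightarrow\; 3\,{}^{4}\mathrm{He}^{2+} \;+\; 8.7~\mathrm{MeV}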

The lasers themselves rely upon cutting-edge "Chirped Pulse Amplification" technology, the development of which won its inventors the 2018 Nobel Prize in Physics. Much smaller and simpler than any of the high-temperature fusion generators, HB11 says its generators would be compact, clean and safe enough to build in urban environments. There's no nuclear waste involved, no superheated steam, and no chance of a meltdown.

"This is brand new," Professor Hora tells us. "10-petawatt power laser pulses. It's been shown that you can create fusion conditions without hundreds of millions of degrees. This is completely new knowledge. I've been working on how to accomplish this for more than 40 years. It's a unique result. Now we have to convince the fusion people — it works better than the present-day hundred-million-degree thermal equilibrium generators. We have something new at hand to make a drastic change in the whole situation. A substitute for carbon as our energy source. A radical new situation and a new hope for energy and the climate."

Indeed, says Hora, experiments and simulations on the laser-triggered chain reaction are returning reaction rates a billion times higher than predicted. This cascading avalanche of reactions is an essential step toward the ultimate goal: reaping far more energy from the reaction than you put in. The extraordinary early results lead HB11 to believe the company "stands a high chance of reaching the goal of net energy gain well ahead of other groups."

"As we aren't trying to heat fuels to impossibly high temperatures, we are sidestepping all of the scientific challenges that have held fusion energy back for more than half a century," says Dr McKenzie. "This means our development roadmap will be much faster and cheaper than any other fusion approach. You know what's amazing? Heinrich is in his eighties. He called this in the 1970s, he said this would be possible. It's only possible now because these brand new lasers are capable of doing it. That, in my mind, is awesome."

Dr McKenzie won't, however, be drawn on how long it'll be before the hydrogen-boron reactor is a commercial reality. "The timeline question is a tricky one," he says. "I don't want to be a laughing stock by promising we can deliver something in 10 years, and then not getting there. First step is setting up camp as a company and getting started. First milestone is demonstrating the reactions, which should be easy. Second milestone is getting enough reactions to demonstrate an energy gain by counting the amount of helium that comes out of a fuel pellet when we have those two lasers working together. That'll give us all the science we need to engineer a reactor. So the third milestone is bringing that all together and demonstrating a reactor concept that works."

This is big-time stuff. Should cheap, clean, safe fu­sion en­ergy re­ally be achieved, it would be an ex­tra­or­di­nary leap for­ward for hu­man­ity and a huge part of the an­swer for our fu­ture en­ergy needs. And should it be achieved with­out in­sanely hot tem­per­a­tures be­ing in­volved, peo­ple would be even more com­fort­able hav­ing it close to their homes. We’ll be keep­ing an eye on these guys.

...

Read the original on newatlas.com »

9 694 shares, 27 trendiness, 0 words and 0 minutes reading time

Explorable Explanations

Lion cubs play-fight to learn so­cial skills. Rats play to learn emo­tional skills. Monkeys play to learn cog­ni­tive skills. And yet, in the last cen­tury, we hu­mans have con­vinced our­selves that play is use­less, and learn­ing is sup­posed to be bor­ing.

Gosh, no won­der we’re all so mis­er­able.

Welcome to Explorable Explanations, a hub for learning through play! We're a "disorganized movement" of artists, coders & educators who want to reunite play and learning.

Let’s get started! Check out these 3 ran­dom Explorables:

...

Read the original on explorabl.es »

10 684 shares, 27 trendiness, 3007 words and 22 minutes reading time

How to Write Usefully

February 2020

What should an essay be? Many people would say persuasive. That's what a lot of us were taught essays should be. But I think we can aim for something more ambitious: that an essay should be useful.

To start with, that means it should be correct. But it's not enough merely to be correct. It's easy to make a statement correct by making it vague. That's a common flaw in academic writing, for example. If you know nothing at all about an issue, you can't go wrong by saying that the issue is a complex one, that there are many factors to be considered, that it's a mistake to take too simplistic a view of it, and so on.

Though no doubt correct, such statements tell the reader nothing. Useful writing makes claims that are as strong as they can be made without becoming false.

For example, it's more useful to say that Pike's Peak is near the middle of Colorado than merely somewhere in Colorado. But if I say it's in the exact middle of Colorado, I've now gone too far, because it's a bit east of the middle.

Precision and correctness are like opposing forces. It's easy to satisfy one if you ignore the other. The converse of vaporous academic writing is the bold, but false, rhetoric of demagogues. Useful writing is bold, but true.

It's also two other things: it tells people something important, and that at least some of them didn't already know.

Telling people something they didn't know doesn't always mean surprising them. Sometimes it means telling them something they knew unconsciously but had never put into words. In fact those may be the more valuable insights, because they tend to be more fundamental.

Let's put them all together. Useful writing tells people something true and important that they didn't already know, and tells them as unequivocally as possible.

Notice these are all a matter of degree. For example, you can't expect an idea to be novel to everyone. Any insight that you have will probably have already been had by at least one of the world's 7 billion people. But it's sufficient if an idea is novel to a lot of readers.

Ditto for correctness, importance, and strength. In effect the four components are like numbers you can multiply together to get a score for usefulness. Which I realize is almost awkwardly reductive, but nonetheless true.

How can you ensure that the things you say are true and novel and important? Believe it or not, there is a trick for doing this. I learned it from my friend Robert Morris, who has a horror of saying anything dumb. His trick is not to say anything unless he's sure it's worth hearing. This makes it hard to get opinions out of him, but when you do, they're usually right.

Translated into essay writing, what this means is that if you write a bad sentence, you don't publish it. You delete it and try again. Often you abandon whole branches of four or five paragraphs. Sometimes a whole essay.

You can't ensure that every idea you have is good, but you can ensure that every one you publish is, by simply not publishing the ones that aren't.

In the sciences, this is called publication bias, and is considered bad. When some hypothesis you're exploring gets inconclusive results, you're supposed to tell people about that too. But with essay writing, publication bias is the way to go.

My strategy is loose, then tight. I write the first draft of an essay fast, trying out all kinds of ideas. Then I spend days rewriting it very carefully.

I've never tried to count how many times I proofread essays, but I'm sure there are sentences I've read 100 times before publishing them. When I proofread an essay, there are usually passages that stick out in an annoying way, sometimes because they're clumsily written, and sometimes because I'm not sure they're true. The annoyance starts out unconscious, but after the tenth reading or so I'm saying "Ugh, that part" each time I hit it. They become like briars that catch your sleeve as you walk past. Usually I won't publish an essay till they're all gone — till I can read through the whole thing without the feeling of anything catching.

I'll sometimes let through a sentence that seems clumsy, if I can't think of a way to rephrase it, but I will never knowingly let through one that doesn't seem correct. You never have to. If a sentence doesn't seem right, all you have to do is ask why it doesn't, and you've usually got the replacement right there in your head.

This is where essayists have an advantage over journalists. You don't have a deadline. You can work for as long on an essay as you need to get it right. You don't have to publish the essay at all, if you can't get it right. Mistakes seem to lose courage in the face of an enemy with unlimited resources. Or that's what it feels like. What's really going on is that you have different expectations for yourself. You're like a parent saying to a child "we can sit here all night till you eat your vegetables." Except you're the child too.

I'm not saying no mistake gets through. For example, I added condition (c) in "A Way to Detect Bias" after readers pointed out that I'd omitted it. But in practice you can catch nearly all of them.

There's a trick for getting importance too. It's like the trick I suggest to young founders for getting startup ideas: to make something you yourself want. You can use yourself as a proxy for the reader. The reader is not completely unlike you, so if you write about topics that seem important to you, they'll probably seem important to a significant number of readers as well.

Importance has two factors. It's the number of people something matters to, times how much it matters to them. Which means of course that it's not a rectangle, but a sort of ragged comb, like a Riemann sum.
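
Read literally (an editorial gloss, not Graham's own formula), the Riemann-sum picture is a sum over readers: if $m_i$ is how much the topic matters to reader $i$, out of $n$ readers, then

    \text{importance} \;\approx\; \sum_{i=1}^{n} m_i

a ragged comb, rather than the rectangle $n\,\bar{m}$ you would get if it mattered equally to everyone.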

The way to get novelty is to write about topics you've thought about

...

Read the original on paulgraham.com »
