67: Measuring Developer Productivity with Diff Authoring Time

37:01
 

At Meta, engineers are our biggest asset, which is why we have an entire org tasked with making them as productive as possible. But how do you know whether your projects for improving developer experience are actually successful? For any other product you would run an A/B test, but that requires metrics, and how do you measure developer productivity? Sarita and Moritz have been working on exactly that with Diff Authoring Time (DAT), which measures how long it takes to submit a change to our codebase. Host Pascal talks to them about how it is implemented, the challenges involved, and the possibilities it unlocks.
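
The episode does not spell out the implementation, but as a rough illustration of the core idea (measuring the time from the first edit on a change to its submission), here is a minimal sketch. The event names, schema, and aggregation below are assumptions made for illustration, not Meta's actual event pipeline.

```python
from datetime import datetime

# Hypothetical events for a single diff; the real event sources and
# edge-case handling (discussed around 10:55 and 13:15) are not shown.
events = [
    {"diff_id": "D123", "type": "first_edit", "ts": datetime(2024, 9, 30, 9, 0)},
    {"diff_id": "D123", "type": "submitted",  "ts": datetime(2024, 9, 30, 11, 30)},
]

def diff_authoring_time_minutes(events, diff_id):
    """Return authoring time in minutes for one diff (illustrative only)."""
    ts = {e["type"]: e["ts"] for e in events if e["diff_id"] == diff_id}
    delta = ts["submitted"] - ts["first_edit"]
    return delta.total_seconds() / 60

print(diff_authoring_time_minutes(events, "D123"))  # 150.0
```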

Got feedback? Send it to us on Threads (https://threads.net/@metatechpod), Twitter (https://twitter.com/metatechpod), Instagram (https://instagram.com/metatechpod) and don’t forget to follow our host @passy (https://twitter.com/passy, https://mastodon.social/@passy, and https://threads.net/@passy_). Fancy working with us? Check out https://www.metacareers.com/.

Links

Timestamps

  • Episode intro 0:05

  • Sarita Intro 2:33

  • Moritz Intro 3:44

  • DevInfra as an Engineer 4:25

  • DevInfra as a Data Scientist 5:12

  • Why DevEx Metrics? 6:04

  • Average Diff Authoring Time at Meta 9:55

  • Events for calculating DAT 10:55

  • Edge cases 13:15

  • DAT for Performance Evaluation? 20:29

  • Analyses on DAT data 22:29

  • Onboarding to DAT 23:23

  • Stat-sig data 25:06

  • Validating the metric 26:34

  • Versioning metrics 28:09

  • Detecting and handling biases 29:19

  • Diff coverage 30:30

  • Do we need DevX metrics in an AI software engineering world? 31:23

  • Measuring the impact of AI tools 32:23

  • What's next for DAT? 33:40

  • Outtakes 36:22

