Next level of mastery: using the console as a tool for presentations... Thanks, David, for such a presentation.
What a BOSS! Live presentation with a console. Mind blown, Dave.
0:00 Introduction
2:30 Built-In Types
3:45 Generalities about the Built-In Types
7:10 Tuple
8:43 List
9:14 Set
10:00 Dict
10:50 Counter
11:30 Default Dict
13:20 Basic Powers
14:25 Built-In Powers
15:33 Iteration Powers
16:53 Example Food Inspections in Chicago
19:54 Check all possible results
20:27 Filtering Failed Inspections
21:35 Filtering the Most Common Names in the Failed Inspections
23:26 Cleaning up the data, a little bit
26:00 Checking the most common street in the Failed Inspections
27:00 Filtering the Most Common Street in the Failed Inspections by Year
29:20 Grabbing everything on the most common street in the failed inspections of 2016
30:14 Business names on the most common street in the failed inspections
31:30 Identifying the worst location
33:40 Collecting all the inspections made at the location, by license number
35:44 Answering: what is the most common way to fail an inspection at the location?
41:00 Outro
He is the GOAT. I'm reading his Python Distilled book; it's really simple but goes into enough depth.
Is there any chance someone has a link to the talk Dave mentions about data deduping?
I don't have a link. But I can tell you that I once used sys.intern to great effect in that regard. I had a database I was pulling millions of rows out of, and one of the fields was a short string that only had about 10 different possible values. sys.intern saved a ton of memory.
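In case it helps, here's a minimal sketch of the idea; the file and column names are made up, not from my actual database:

import csv
import sys

# sys.intern() collapses equal strings into one shared object, so millions of
# rows that repeat the same handful of values only keep one copy of each value.
rows = []
with open('records.csv') as f:                       # hypothetical file
    for row in csv.DictReader(f):
        row['status'] = sys.intern(row['status'])    # hypothetical low-cardinality column
        rows.append(row)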
Dave's a superhero
Watching Dave talk is always a treat! You can get so much done by sticking with just the builtin library. (I wonder what he uses to create the slides?)
He's using a modified Python interpreter for the presentation. Nice ASCII art.
You can do it yourself; the REPL is exposed as a module (the standard-library code module).
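Rough sketch of what I mean (not claiming this is how Dave does it): a script can pre-load whatever data it wants and then drop into an interactive prompt with code.interact.

import code

# Anything placed in `local` is available as a variable inside the session.
slides = ['Built-In Types', 'Basic Powers', 'Iteration Powers']   # example data
code.interact(banner='*** presentation REPL ***', local={'slides': slides})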
The headers are a mystery; is that some special Unicode sauce? The rest I can kinda imagine him redefining __repr__ on the string type to show custom things for hardcoded values.
@Yaxqb I assume it was done using terminal escape codes
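Something like this would do it with plain ANSI escape sequences (the header text here is just an example):

# \033[1m = bold, \033[7m = inverse video, \033[0m = reset attributes.
print('\033[1;7m' + ' Built-In Types '.center(40) + '\033[0m')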
25:39 What talk is he talking about? Anyone have a link?
import csv
items = list(csv.DictReader(open('file.csv')))
That's so dope
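And once the rows are loaded into that `items` list, most of the analysis in the talk is just Counter/defaultdict over those dicts. A small sketch, assuming a 'Results' column like the Chicago food inspections data seems to have (double-check the exact column name against the dataset):

from collections import Counter

# Tally every inspection outcome, e.g. to see how many failed.
results = Counter(row['Results'] for row in items)
print(results.most_common())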