Wednesday, 2013-05-22

*** tpb has joined #pycon-av00:00
*** mithro has quit IRC00:50
*** skay_1 has joined #pycon-av01:16
*** skay_ has quit IRC01:16
*** skay_1 is now known as skay_01:16
*** tpb has joined #pycon-av01:26
*** parx has joined #pycon-av02:58
*** sylphiae has quit IRC03:08
*** parx has left #pycon-av04:14
*** parx has joined #pycon-av04:18
*** skay_ has quit IRC05:31
*** sylphiae has joined #pycon-av10:38
*** mithro has joined #pycon-av12:00
*** skay_ has joined #pycon-av14:24
iiieskay, CarlFK, mithro: I don't know that I count as a core stake holder for veyepar (I do like it and very occasionally contribute).  From my perspective as soon as the data can be expected to not be maintained, it shouldn't be used anymore.  So after a conference when data either disappears or becomes static either mirror/cache "forever" or record the data that we were interested in.  Mirroring is over the top, especially if we want to change it later for16:09
iiie accuracy (accurate for information, maybe present, certainly not historically what the data was).16:09
iiiePoint is that after the conference / event is over the principals (organizers, presenters / speakers, and attendees) will NOT be maintaining the data.  I could understand data-conscious (^_^) organizers updating for a week, a month (2,3,6,9,12).  Even in the most extreme case (say the conference is on data preservation) I would be amazed if the data were updated at source (api / website) a year after the event.16:13
iiieHow does syncing work today?  I thought "syncing" for veyepar was ingestion (good link skay).  The only reason to sync ever again is to get updates (additions and updates, as unless the previous grab's actions are scrapped, deletion doesn't count).16:16
CarlFK"after the conference / event is over the principals (organizers, presenters / speakers, and attendees) will NOT be maintaining the data"   Um.. not true.16:17
iiieThe data isn't versioned, so rollback isn't an option.  veyepar in general isn't versioned (this is not a criticism as it isn't a need so far).16:17
iiieWhat's the longest after an event that the data has been changed so far?  I'd like the conference / event folk to maintain data, but depending on them to do so seems very risky.16:20
CarlFKthe point of this thread is: if data needs to be updated, should it happen at the conference site, veyepar, or youtube & pyvideo?16:21
CarlFK"needs" being kinda key here...16:23
iiieThe sane option would be conference site (and api), then pulled to veyepar, and pushed to the published place.16:23
CarlFKright16:23
CarlFKI can see if the conference site was flipped from a data driven thing to a bunch of static html files16:23
iiieBut what data can even be updated?  What if the title of a talk changed?16:23
CarlFKthat goes back to does this really need to be changed ?16:24
skay_iiie:  (unscheduled talks, schedule changes, canceled talks… none of that usually gets updated on the site. they might get updated in signs during an event)16:24
CarlFKusually doesn't matter :)16:25
skay_(I think the "process" you all follow should allow for fault tolerance like that)16:25
CarlFKWhen the schedule shifted Troy updated the start times of nodepdx16:25
skay_(and no freaking out and getting all nazi over people not having data and then being dicks to them)16:25
CarlFKno telling me when I can't freak out and go nazi16:26
skay_I know. I'm venting a bit16:27
CarlFKnot having an ID for node caused me pain16:27
iiieRight, fault tolerance!  Updates vary by event / organizer16:27
skay_I'm not thinking about freakouts with nodepdx or troy16:27
skay_it's other things I have impressions of from before16:27
skay_iiie: nobody ever got fired for assuming that people are unreliable and lazy and have other things to do16:28
skay_well, I mean the client who is like that might be fired. but then you don't get paid for doing everything for them16:28
iiiehe he he16:28
skay_buy IBM!16:28
skay_("nobody ever got fired for buying IBM")16:29
CarlFKit makes everyone's (client, me, presenters) lives easier if the client/presenter maintain the data and I pull copies16:30
CarlFKwhen the client agrees to that, then breaks it... I need to freak out16:30
skay_I'm not sure you've ever had clients consciously agree to that16:30
iiiethere is a minimum set of data that must be available for the recording sheets (which in turn is the process's hard minimum).  The data has to be accurate or the sheets won't be and the process will start to break.16:31
skay_iiie: it's too fragile for handling things like a talk that changes within the hour16:32
iiieyes16:32
skay_iiie: unless you can immediately update veyepar and reprint16:32
skay_or have a tablet with webapp16:32
CarlFKwhat do you mean fragile ?16:32
skay_a talk changes within the hour and then you go bug an organizer to change their data so that you can push it to veyepar so that you can reprint16:32
skay_not good16:32
skay_not enough turn around time16:32
CarlFKI need the correct title in the db at the time I encode.. which is shortly after the talk happens16:33
skay_fragile: easy for the process (not just technology, things done manually) to break down16:33
iiiebut that's a change in process.  That's what it would take to get to "more fault-tolerant"16:33
skay_fragile: in your case, you want to be able to immediately react to changes16:33
skay_but you have a chatty protocol that depends on an unreliable participant16:34
skay_so that the roundtrip would probably take longer than it is worth anyone's time to handle in 30 minutes16:34
skay_you need "eventual consistency"16:34
skay_to make an analogy about databases.16:34
skay_I mean, with16:35
skay_arg, what does CAP stand for? consistency, aaa, partitions?16:35
skay_http://en.wikipedia.org/wiki/CAP_theorem consistency, availability, partition tolerance16:35
tpbTitle: CAP theorem - Wikipedia, the free encyclopedia (at en.wikipedia.org)16:35
iiieeventual consistency would require video replacement (re-encode and upload) for any changes that changed the encoded video.16:35
CarlFKlets take this case:   presenter is missing when it is time to talk, so 10 seconds before start time organizer grabs someone else to talk about something else.16:36
skay_iiie: yeah, so that's too hard16:36
skay_iiie: at least for now16:36
skay_iiie: I was asking carl how workable it would be to get a cadre of volunteers helping with responses for reviewing videos — there's a small stream of emails that come in with "20 minutes in the sound is weird"16:36
CarlFKI don't need the recording sheet updated - there are still blanks to fill in start/end for that time slot and room16:36
skay_or "change the opening screen"16:36
skay_but that would depend on raw files being around and accessible to volunteers etc16:37
CarlFKsomeone needs to be responsible for a) defining what the talk title is, and b) entering that data somewhere16:37
skay_CarlFK: I think you could wing it with the talk title16:37
skay_and eventually change it if someone complains16:38
CarlFKof course I could.. but I shouldn't16:38
iiie"delay - metadata crazy" flag(s) on a talk could catch some of that (from uploading at least)16:38
skay_CarlFK: well, merging when you clobber veyepar data would be really annoying16:38
CarlFKthe changing-it-later case should be avoided16:38
skay_that's not true, you want to update your data from changed conference data, you claim16:39
CarlFKfor what I describe I would flag the talk as broken and not process till the new data was entered16:39
skay_which means things will get clobbered and broken in your data16:39
skay_Chris's talk isn't broken, but you are effectively holding it hostage16:39
skay_through no fault of his own16:39
CarlFK(11:38:10 AM) skay_: and eventually change it if someone complains  <- avoid that.. which implies it went public16:39
skay_you could have uploaded it the same night16:39
iiietalk title / data entry could be easy; just a webform somewhere to submit new data for this talk (not overwriting, just an easy option to review and update)16:39
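A hypothetical sketch of what a review-and-update webform like that could look like, assuming a Django setup along the lines of veyepar's; the TalkCorrection model, form, and field names below are invented for illustration and are not existing veyepar code:

    # Hypothetical sketch -- suggested corrections are stored for review
    # rather than overwriting the talk record directly.
    from django.db import models
    from django import forms

    class TalkCorrection(models.Model):
        talk_id = models.IntegerField()                # talk being corrected
        new_title = models.CharField(max_length=200, blank=True)
        note = models.TextField(blank=True)            # free-form description of the change
        submitted_at = models.DateTimeField(auto_now_add=True)
        applied = models.BooleanField(default=False)   # flipped once a human reviews it

    class TalkCorrectionForm(forms.ModelForm):
        class Meta:
            model = TalkCorrection
            fields = ['talk_id', 'new_title', 'note']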
skay_1% of complaints and they will most likely be friendly?16:40
skay_I don't see the problem.16:40
skay_versus hold up 70% or all of the conference videos?16:40
CarlFKthe problem is it is time consuming to make the change16:40
skay_yeah but you are often getting paid16:40
CarlFKwho said 70% ?16:40
skay_I made up 70%16:41
CarlFK1 talk out of 10 is only 10%16:41
iiiedepending on the change, we could make changing non-encoded videos much easier.16:41
skay_I think like 1, 2, many, lots of lots of many16:41
skay_s/70%/many16:41
skay_where many is too much16:41
* iiie just realized he has to be somewhere else16:41
skay_iiie: later!16:42
CarlFKiiie: the conference site likely has this easy data entry form you mentioned16:42
skay_me too.16:42
skay_I should be downstairs to get water16:42
CarlFKI'll get water for you16:42
CarlFKless you need to walk16:42
skay_no no no I will have an excuse to walk down the stairs16:42
skay_change my environment for a few minutes16:42
skay_it is fun16:42
skay_except when it isn't16:42
iiieyes, the conference site should have that form; sorry to argue and run16:42
skay_but that's usually when I'm depressed16:42
skay_CarlFK: yeah if only conferences would give you all access to their data like that16:43
skay_so I think one day maybe you could just say… let me provide the schedule data! I will do it all!16:43
skay_then you get to be the lord and master of all the data and metadata16:43
skay_cut them out of the annoying loop16:43
skay_and your volunteers will have google glass clones and can change the db on the fly16:43
skay_or maybe just phones where they take pictures of the handwritten recording sheets and then the google glass clones upload and sync the papers to the right record16:44
skay_ok water time16:44
skay_http://xenia.media.mit.edu/~rhodes/Papers/wear-ra-personaltech/index.html16:44
tpb<http://ln-s.net/-:CC> (at xenia.media.mit.edu)16:44
skay_The wearable remembrance agent: a system for augmented memory16:44
*** mrissa has joined #pycon-av16:58
mrissahello16:58
CarlFKhi mrissa16:59
*** mrissa has quit IRC17:05
skay_CarlFK: I updated the issue about internet archive with some helpful information I got from the emails and I also pointed out some headers that will be useful. https://github.com/CarlFK/veyepar/issues/2317:21
tpbTitle: internet archive material is uploaded in Community Texts · Issue #23 · CarlFK/veyepar · GitHub (at github.com)17:21
skay_see the last comment17:21
CarlFKneat17:27
*** fqxp has joined #pycon-av18:13
CarlFKskay - did you want to try to code this into the uploader code?18:38
CarlFKI need to create a bucket for node.. wondering if I should leave that for later18:39
CarlFKcan you take a shot at it now and I'll fix leftovers and take a shower18:40
skay_CarlFK: I'm not sure I can get anything intelligent coded for you today. I was going to do it last night but was in such a depressive funk that I couldn't18:41
skay_CarlFK: since you need a bucket for node quickly, I suggest making it by hand18:42
skay_I'll give you the command18:42
CarlFKthat works18:42
skay_I'm not hungry for leftovers, I ended up eating pita bread and now I am full18:42
skay_but make sure you eat18:42
CarlFKcan you make me a node bucket?18:42
skay_btw, got even more helpful emails from the IA folks. also, they claim to want the dv files18:42
CarlFKpfft18:43
skay_I told them I don't know about the logistics for that18:43
CarlFKyeah18:43
skay_I asked if they had some place in Chicago where someone could drop off hard drives18:43
CarlFKalso, I don't see the point in the DV18:43
skay_I'm kind of curious18:43
skay_yeah but they do, and they are smart, so...18:43
skay_and they have big ideals that I believe in18:44
CarlFKare they smart enough to put the point in writing ?18:44
CarlFKpsf failed on that part.. so I am suspicious18:44
skay_I don't want to be in the middle of that email thread when it's really between you and them instead of between me and them18:44
skay_I just wanted to mention that they'd like the dv files18:45
skay_and they probably wouldn't do questionable things like mail you a drobo18:45
CarlFKmy guess is they think there is a 1:1 between dv and what I upload18:45
skay_yeah, I was wondering that18:45
CarlFKif there was, that would make some sense18:45
skay_but I'm not sure they care if it's different18:45
skay_maybe they think the raw footage is worthwhile in itself18:46
skay_anyway, I will think about the bucket now. but if work interrupts, I will do work18:46
CarlFKk, thanks18:46
skay_you have a bucket! https://archive.org/details/nodepdx2013conference  it is 'nodepdx2013conference'19:30
tpbTitle: nodepdx2013conference : Free Download & Streaming : Internet Archive (at archive.org)19:30
skay_here is what I did:19:31
skay_    headers = {19:31
skay_            'x-archive-meta-mediatype':'movies',19:31
skay_            'x-archive-meta-collection':'opensource_movies',19:31
skay_            'x-archive-meta-year':'2013',19:31
skay_            'x-archive-meta-subject':'node.js;conference',19:31
skay_            'x-archive-meta-licenseurl':'http://creativecommons.org/licenses/by/3.0/us/',19:31
skay_            'x-archive-meta-description':'The <a href=http://nodepdx.org/>nodepdx</a> 2013 conference'19:31
skay_    }19:31
tpbTitle: NodePDX - An independent conference focused on JavaScript, held yearly in Portland, Oregon (at nodepdx.org)19:31
skay_those are the headers I used.19:31
skay_then let's say you get a connection like you normally do (you can see that in auth in archive_uploader.py)19:32
skay_let's call that var conn for convenience's sake19:32
skay_conn.create_bucket('nodepdx2013conference', headers=headers)19:32
skay_voila19:32
skay_or viola19:32
skay_or double bass19:32
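Pulling skay_'s snippets above together, a minimal sketch of the whole bucket-creation step might look like the following, assuming boto and Internet Archive S3-style keys; the endpoint, the placeholder key values, and the connection arguments are assumptions, and archive_uploader.py in veyepar remains the real reference for how the connection is actually built:

    # Minimal sketch (assumptions: boto is installed, IA S3-style keys are
    # available, and s3.us.archive.org is the S3-compatible endpoint).
    from boto.s3.connection import S3Connection, OrdinaryCallingFormat

    access_key = 'IA_ACCESS_KEY'   # placeholder
    secret_key = 'IA_SECRET_KEY'   # placeholder

    conn = S3Connection(access_key, secret_key,
                        host='s3.us.archive.org',
                        is_secure=False,
                        calling_format=OrdinaryCallingFormat())

    # x-archive-meta-* headers become item metadata on archive.org
    headers = {
        'x-archive-meta-mediatype': 'movies',
        'x-archive-meta-collection': 'opensource_movies',
        'x-archive-meta-year': '2013',
        'x-archive-meta-subject': 'node.js;conference',
        'x-archive-meta-licenseurl': 'http://creativecommons.org/licenses/by/3.0/us/',
        'x-archive-meta-description': 'The <a href=http://nodepdx.org/>nodepdx</a> 2013 conference',
    }

    bucket = conn.create_bucket('nodepdx2013conference', headers=headers)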
*** skay_ has quit IRC19:46
*** CarlFK has quit IRC19:48
*** fqxp has joined #pycon-av20:01
*** skay_ has joined #pycon-av20:33
skay_nodepdx2013conference21:19
*** CarlFK has joined #pycon-av21:22
skay_nodepdx2013conference21:22
*** fqxp has quit IRC21:46
*** skay_ has quit IRC22:03
*** parx1 has joined #pycon-av22:05
*** parx has quit IRC22:06
*** skay_ has joined #pycon-av22:20
*** parx1 is now known as parx23:07
*** parx has joined #pycon-av23:07
*** skay_ has quit IRC23:19
*** CarlFK has quit IRC23:29
*** mithro has quit IRC23:52

Generated by irclog2html.py 2.5 by Marius Gedminas - find it at mg.pov.lt!