MythTV finally!!

This weekend I wanted to set up my MythTV box. I had planned it a couple of times already, but almost every time something came up. Well, this weekend I practically locked myself in and got started with the KnoppMyth CD. Piece of cake, you would think!! Well, if you know what you're doing, yes; otherwise, no!!

First I ran into some very strange hardware issues. The computer I wanted to run it on was a used PC I bought half a year ago. I put in a 180 Gig hard disk instead of the small one that was in there. Then the strangest thing happened: with the hard disk as primary master and the CD-ROM as secondary master, the freaking thing wouldn't recognise the hard disk. Okay, unplugged the CD-ROM, now it worked. Plugged it back in, and same problem, not recognised!!
Okay, put the CD-ROM on slave: hard disk detected, but no CD-ROM? :s Put them both on the same IDE channel, hard disk as master with slave and CD-ROM as slave (even tried cable select), and neither got detected. Finally I tried a new IDE cable for the hard disk, put the CD-ROM on another cable, and as a last change removed the jumper on the hard disk, which was set to master-with-slave (there was no master-only option). That finally worked, but by then a couple of hours were already lost 🙁

Then the first install. I remembered from our LUG meeting that KnoppMyth's BE support was very poor, so I downloaded all the scripts; too bad I was running a different version than the one the scripts were written for. The channels.sql script dropped the channels table and the whole thing blew up (hmmm, no backup of MySQL made, pffff, restart).
After some hours of testing and breaking it, I came to the simplest solution thus far: install KnoppMyth, scan for your channels, use the guide.xml for the TV listings, and then rename your scanned channels to the ones the TV guide has. Now my Myth box is set up, and I've already recorded a CSI: Miami and a Temptation Island show 🙂
I still have some reading to do, as both shows are an hour long and 2 Gig in size. I did find an XviD option somewhere; well, that's for another time.

A funny note: my girlfriend isn't always interested in my IT stuff, but I showed her this thing and after five minutes I got a list of shows I had to record for her. 😀

TogetherJ

Here at work they paid quite a bit of money for TogetherJ, a UML environment based on Eclipse. They want to use it to reverse engineer some of my implementations. I had already made a sequence diagram in Dia before the project started, which took me about half an hour. Now that the code is finished you would expect this to go much faster, since TogetherJ has a reverse engineering feature. Let me just say that removing, moving and unlinking things in my diagram has taken me more than an hour so far and I'm not finished yet. So consider this a shout-out that I really, really dislike TogetherJ. It's this kind of tool that gives Java the reputation of being slow, while it actually isn't (okay, the startup of the VM has to be included in each app, but once started it really isn't slow). Really. BUT don't use it in places where you shouldn't: modelling, in my eyes, is a very graphics-intensive thing, and Java isn't there yet; maybe if they used Java 3D (if they even can, I don't know the status of Java 3D). If you really want to do this you have two options imho: do it the Mono way and write a C or C++ object that does the native drawing with some bindings around it, but then you run into Java not being fast at calling outside the JVM; OR just fix the freaking drawing in Java and give it a performance boost. Maybe this is already faster in the next editions of J2SE, or maybe 2D really isn't that slow, but then it's even worse, because that would mean the people over at Borland can't program, and I find that hard to believe!!!

I'm trying to persuade my boss to use pvanhoof's generator and build some reverse engineering features into it. Let's hope he goes for it; I did point out that his code is (L)GPL, so all features would have to be committed back to the code.

Google docu

I saw the end of a Google documentary on Dutch television. I searched for it online and came up with this link. It doesn't work on my Ubuntu box yet, but the site says it will only be available from Monday. I hope to watch it in full; it really was interesting. It wasn't pro-Google: they were more pushing the point that Google is a monopolist and that the EU and US should start monitoring them.

Ubuntu sound

This evening I downloaded the Asterisk@Home VMware image. You just run it in the free VMware Player (not open source (yet)), point your nice Firefox at the admin module, install the extra features, set up your extensions and conference rooms, and off you go! It's much easier than setting it up on a Debian under Xen: no editing of config files. You'd probably still want to know how this works under the hood, but the Asterisk@Home project has really made setting up a VoIP server easy!!

So I wanted to connect my Ekiga to the VoIP server, but because I was already running Skype I got problems with my sound. The friendly folks over at #ekiga on GIMPNet pointed me to this wiki entry. I created the .asoundrc file, but when I moved it out of the way everything still kept working, so I suspected you only had to change Ekiga's input and output devices and that setting them to 'default' should have done the trick. It turns out I really do need the file after all (I had to move it back), otherwise mixing sound from multiple apps doesn't work!
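For reference, the dmix approach boils down to an .asoundrc along these lines. This is my own stripped-down version, not the literal one from that wiki entry, and the hw:0,0 bit assumes a single sound card, so treat it as a sketch:

    pcm.!default {
        type plug
        slave.pcm "softmix"   # send every app through the mixing device
    }

    pcm.softmix {
        type dmix             # dmix does the software mixing
        ipc_key 1024          # any unique number will do
        slave {
            pcm "hw:0,0"      # first card, first device
        }
    }

With something like that in place, Ekiga and a music player can both open the default device at the same time instead of fighting over the sound card.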

I wanted to try playing sound from several apps together, so I wanted to fire up XMMS, but hmm, that's so old school, right?! Well, I'm pro-Mono and I know Aaron is doing one hell of a job on the Banshee project. As I understand it, it will be the first open source (GNOME) application that can ship with an MP3 codec built in; they didn't use the GPL'd one, and that's why they can ship it. What really caught my eye was the speed at which it loads the MP3s from my hard disk. When I clicked the same folder in Rhythmbox the app stopped responding, while Banshee showed a nice progress bar and I could still move the window. I don't know why, but it also seems the sound quality between XMMS and Banshee is totally different: Banshee sounds much purer. I don't know if that's normal; I've asked on the channel but haven't got any answers yet. The guys over at Novell are really going fast, they have produced some amazing apps in a very short period of time (Banshee, Beagle, F-Spot, … and Diva is coming, WOOT!!) I'm a happy (Mono) camper!!
The guys over at #mono doubted that this was actually possible, but hey, I know what I'm hearing.

Web standards can save your business money

As I blogged earlier, I'm reading Kernel Development. Now for my day job, I'm creating a web application using some Java frameworks; JSP was chosen as the view layer. I had to start from the existing application, so I cleaned up the HTML using Dreamweaver and put all the presentation in one CSS file.

Now my project is going live on Monday and I've tested all the functionality, so everything should be okay. So I started reading Designing with Web Standards; I've already read 100+ pages of it. It's really a good book and it reads very nicely. I recommend it to everybody who does web development. The author makes a very nice point: if you use web standards, you split your design into HTML/XML/XHTML and CSS. Because your HTML/XML/XHTML is only content and no layout, it gets much smaller, so less bandwidth has to be used. Your CSS is mostly cached by the browser, so it doesn't get sent on every visit. Big deal? Well, actually yes. Imagine you have a site that gets 100,000 requests a day and a contract with your ISP for X Gig of traffic. If every page were 80 KB, that would result in roughly 7.8 Gig of transfer a day. Okay, that's not that much, but imagine you're Google and you had to pay for transfer (I doubt they do); they get millions and millions of requests a day. If you could decrease your page size to 40 KB, you would have half the transfer: instead of, say, 200 Gig you would have 100 Gig. Now that suddenly is a lot and could save your company a big sum of money.
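To put the two scenarios next to each other (same made-up figures as above):

    100,000 requests x 80 KB per page = 8,000,000 KB ≈ 7.8 Gig of transfer per day
    100,000 requests x 40 KB per page = 4,000,000 KB ≈ 3.9 Gig of transfer per day

Halve the average page size and you halve the transfer, whatever scale you're running at.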

Another good point he made was of course compliance and portability. If you're using web standards, chances are your site isn't that bad on portable devices, screen scrapers, and so on. So imagine that in the future your boss wants his site available from a smartphone or PDA. If you wrote your web tier to be web-standards compliant, it should work out of the box. By work I mean that all the functionality should be available. Another investment saved.

And yet another point (this one I heard on a podcast): if you use web standards, your site should be accessible to blind people. That's something I never really thought of, but those people use computers too. Just imagine a blind person visiting a site that's done fully in Flash; it's utterly useless to him. I'm not sure, but I suppose a site with a lot of markup that isn't valid XHTML isn't going to come across well in his screen reader either. On the same topic they also explained that AJAX is nice, but again, for blind people it can render a page useless. I haven't looked into this yet; it could be that the AJAX triggers and results do get presented to the user, so I'm not going to discuss it further, but it surely makes you reflect on your decisions when designing a website.

Kernel development

As I've already mentioned a couple of times, I'm reading up on the Kernel Development book by Robert Love. In the past I didn't spend that much time looking into the kernel itself; okay, I knew there were great enhancements and stuff like that. But now I've finished the parts about memory allocation for user space, I/O, the process scheduler and a couple of others, and I have to say (again) that it has opened my eyes. I feel like taking the blue pill 😀

One part of the I/O chapter I was reading up on is the elevator design. For those of you who may not know it, it's quite simple. Because I/O is one of the slowest parts of an application, the kernel has a good design for handling seeks. If a process requests data from one sector, the hard disk's arm will go to that position and read it. Now another process could get a timeslice and request another read somewhere totally different. Instead of moving the arm again straight away, the kernel waits a moment (a matter of milliseconds; as I understand it, it doesn't issue the request immediately, but maybe I'm wrong here). If another request comes in for data close by, it will first handle that one and then move on to the next requests. I do have to admit there are parts where my common C knowledge isn't sufficient, but I've also ordered a book on C programming. After I've finished that, I'll start my other kernel development book and then probably reread Robert's book.
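Just to get the idea straight in my own head, here's a toy user-space sketch of that elevator behaviour. It's my own simplification of what the book describes, not the real kernel code, and the sector numbers are made up:

    /* Toy elevator: keep pending reads sorted by sector, merge requests
       that touch adjacent sectors, and dispatch them in one sweep so the
       disk arm only has to move in a single direction. */
    #include <stdio.h>

    struct request {
        long sector;   /* starting sector of the read */
        long count;    /* number of sectors requested */
    };

    #define MAX_REQ 16
    static struct request queue[MAX_REQ];
    static int pending = 0;

    /* Queue a request, keeping the queue sorted by sector; if it is
       adjacent to an existing request, merge it instead of queueing it.
       (No overflow check, this is only a sketch.) */
    static void submit(long sector, long count)
    {
        for (int i = 0; i < pending; i++) {
            if (queue[i].sector + queue[i].count == sector) {
                queue[i].count += count;      /* back merge */
                return;
            }
            if (sector + count == queue[i].sector) {
                queue[i].sector = sector;     /* front merge */
                queue[i].count += count;
                return;
            }
        }
        int i = pending++;
        while (i > 0 && queue[i - 1].sector > sector) {
            queue[i] = queue[i - 1];          /* shift to keep sorted order */
            i--;
        }
        queue[i].sector = sector;
        queue[i].count = count;
    }

    /* Dispatch everything in sector order: one sweep of the arm. */
    static void dispatch(void)
    {
        for (int i = 0; i < pending; i++)
            printf("read %ld sectors starting at sector %ld\n",
                   queue[i].count, queue[i].sector);
        pending = 0;
    }

    int main(void)
    {
        submit(5000, 8);   /* process A */
        submit(100, 8);    /* process B, far away on the disk */
        submit(5008, 8);   /* process A again, right after its first read */
        dispatch();        /* sector 100 first, then one merged 16-sector read at 5000 */
        return 0;
    }

Process A's two adjacent reads end up merged into a single request and the whole queue is served in one sweep, which is the kind of merging and sorting the chapter describes; the real schedulers add deadlines and per-process waiting on top of this.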

So if you really want to be blown off your feet, read it. This book is also perfect for the programmers/admins in your department who think they know everything: give them this book and then do a pop quiz on them, and we'll see who laughs last 😀

http://www.podzinger.com/ --> Google for podcasts

Yesterday evening I was watching the latest episode of Hak5. They had an interview with the guys from Podzinger. Podzinger is actually a nice concept: it's an online search engine for vidcasts and podcasts. From what I understood, they run podcasts and vidcasts through speech recognition software that then indexes each cast.

I found Software Engineering Radio using it. That podcast seems exactly like the idea I had in mind, so after work I'll be downloading it and burning it to CD. Nice radio in the car during my commute to work 🙂

I hope some of my readers will find it as useful as I do.