Updates from June, 2012

  • Urban 16:58 on 9 Jun. 2012 Permalink |  

    Moon 

    I still have a telescope lying around, with an unfulfilled mission to capture last Wednesday’s Venus transit. So here’s a shot of the Moon, captured today, early in the morning. The quality is not great, but the craters are nicely visible near the terminator line. The aperture was stopped down using the front cover mask, which seemed to mitigate some of the effects of bad seeing and average optics.

    Click for larger image.

     
  • Urban 00:12 on 13 May. 2012 Permalink |  

    Two simple solutions for server monitoring 

    I needed something very basic to monitor various personal servers that I look after. They’re scattered across multiple networks and behind many different firewalls, so monitoring systems designed for local networks (Nagios, Munin, etc.) are pretty much out of the question. Granted, this could be worked around with something like a reverse SSH tunnel, but with machines scattered all around, that just adds another point of failure and provides no gain over a simple heartbeat sent as a cron job from the monitored machine to a central supervisor.

    Email notifications

    The first thing that came to mind was email. It can rely on a cloud email provider (Gmail in my case), so it’s scalable, simple to understand and simple to manage. It’s time-tested, robust and works from behind almost any firewall without problems.

    The approach I found most versatile after some trial and error was a Python script in a cron job. Python was just about the only language that came bundled with the complete SMTP/TLS functionality required to connect to Gmail — on virtually any platform (Windows w/Cygwin, any Linux distro, Solaris).

    Here’s the gist of the mail sending code.
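    The gist itself isn’t reproduced here, but the core is just the standard library’s smtplib and email modules. A minimal sketch, assuming a dedicated sender account and an app password (the addresses, hostname and report body below are placeholders):

```python
import smtplib
from email.mime.text import MIMEText

def build_report(sender, recipient, hostname, body):
    """Wrap a plain-text status report in a MIME message."""
    msg = MIMEText(body)
    msg["Subject"] = "[status] " + hostname
    msg["From"] = sender
    msg["To"] = recipient
    return msg

def send_report(msg, password, server="smtp.gmail.com", port=587):
    """Deliver the report through Gmail's SMTP server over TLS."""
    s = smtplib.SMTP(server, port)
    s.starttls()
    s.login(msg["From"], password)
    s.sendmail(msg["From"], [msg["To"]], msg.as_string())
    s.quit()

if __name__ == "__main__":
    report = build_report("monitor@example.com", "admin@example.com",
                          "web01", "disk: 42% used\nuptime: 12 days")
    # send_report(report, "hardcoded-password")  # enable on the monitored machine
```

    A crontab entry pointing at the script (path hypothetical), e.g. `0 7 * * * /usr/bin/python /opt/monitor/report.py`, then produces one report a day.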

    I created a new account on my Google Apps domain, hardcoded the password into the script and installed it as a cron job to send reports once a day or once a week.

    I’d already been using email for quite some time to notify me of finished backup jobs, but only recently did I try this approach for monitoring servers. I’ve expanded it to over 10 different machines, and let me tell you, checking ten messages a day quickly becomes a pain, since their information density is essentially zero in most cases. I needed something better.

    A dashboard

    So I decided to build a dashboard to see the status of all the servers. However,

    • I didn’t want to set it up on any of my existing servers, because that would become a single point of failure.
    • I wanted cloud, but not IaaS, where the instance is so fragile that it itself needs monitoring.
    • I wanted a free and scalable PaaS for exactly that reason: somebody manages it so I don’t have to.

    With that in mind I chose Google App Engine; I’d been reluctant to write for and deploy to App Engine before because of the lock-in, but for this quick and dirty job it seemed perfect.

    I was also reluctant to share this utter hack job, because it “follows the scratch-an-itch model” and “solve[s] the problem that the hacker, himself, is having without necessarily handling related parts of the problem which would make the program more useful to others”, as someone succinctly put it.

    However, just in case someone might have a similar itch, here it is:

    It’s quite a quick fix all right: no SSL, no auth, no history purging. Every problem was solved as quickly as possible, through security by obscurity and through the GAE dashboard for managing database entries, and will be addressed once it actually surfaces. The cost of such a minimal solution getting compromised is smaller than the investment needed to fix all the issues or change the URL. Use at your own risk or modify to suit your needs.

    Here’s a screenshot; it uses Google’s static image charts to visualize disk usage and shows basically any chunk of text you compile on the clients inside a tooltip. More categories to come in the future.
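    The client side of the heartbeat amounts to an HTTP POST from each machine’s cron job. A sketch under stated assumptions: the dashboard URL and field names below are made up, not the real app’s endpoint:

```python
import socket
import urllib.parse
import urllib.request

# Hypothetical endpoint; the real app's URL stays obscure by design.
DASHBOARD_URL = "https://example.appspot.com/report"

def build_payload(host, disk_pct, status):
    """Encode the heartbeat fields as a form-urlencoded body."""
    return urllib.parse.urlencode({
        "host": host,       # which machine is reporting
        "disk": disk_pct,   # visualized as an image chart on the dashboard
        "status": status,   # free-form text, shown in a tooltip
    }).encode()

def send_heartbeat(url=DASHBOARD_URL):
    """POST this machine's status to the central supervisor."""
    payload = build_payload(socket.gethostname(), 42, "all services running")
    urllib.request.urlopen(url, payload, timeout=10)
```

    Because it’s plain HTTP from the client’s point of view, it works from behind the same firewalls the email approach does.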

     
  • Urban 23:04 on 27 Mar. 2012 Permalink |  

    Gravity Lab update: now in 3D 

    This update has long been in the making and it finally makes Gravity Lab what it was meant to be. Namely, a gravity simulator with 3D display of gravity potential. Take a look (click for larger image).

    However, a point must be made for the scientifically inclined: the “gravity well” is not completely accurate, for usability reasons. Let me explain.

    The sizes of the celestial bodies (I’m especially referring to the solar system preset) are not exactly true to scale. You see, if they were, you couldn’t see them. The distances involved are so large compared to the sizes of the planets and the Sun that they’d be sub-pixel-sized (even on an iPad 3 with a Retina screen) if the solar system were to fit the screen. You can check their true sizes with an app like Solar Walk (or simply google it).

    So I had to cheat a little and inflate the planets so we can see them. But this came back to bite me when drawing the mesh. The reason is that the gravity potential obeys different laws inside vs. outside of an object. Namely, outside of an object the gravitational pull is inversely proportional to the square of the distance from the center of mass, while inside the object it changes linearly with the distance from the center.
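    For a uniform-density sphere the two regimes look like this in code (a sketch of the physics, not the app’s actual implementation):

```python
def gravity(r, M, R, G=6.674e-11):
    """Gravitational acceleration at distance r from the centre of a
    uniform sphere of mass M and radius R (SI units)."""
    if r >= R:
        return G * M / r**2   # outside: inverse-square law
    return G * M * r / R**3   # inside: linear in r, zero at the centre
```

    The two branches agree at the surface r = R, so the field (and the well drawn from it) is continuous there.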

    So I had two choices: represent each body basically as a point mass (that would kind of resemble a black hole when zoomed out) which would produce extremely steep wells spiking out of screen and out of sight, or cheat again and try to approximate (smooth) them with a more friendly function. I chose the latter.
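    Gravity Lab’s actual smoothing function isn’t spelled out here, but one standard choice for taming a point-mass spike is Plummer softening, which caps the depth of the well instead of letting it diverge:

```python
import math

def softened_potential(r, M, eps, G=1.0):
    """Plummer-softened potential: behaves like -G*M/r far from the
    body, but stays finite (-G*M/eps) at r = 0 instead of diverging
    like a point mass. eps is the softening length."""
    return -G * M / math.sqrt(r * r + eps * eps)
```

    A larger eps gives a shallower, friendlier well; as eps goes to zero you recover the out-of-screen spike of the point mass.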

     
  • Urban 23:27 on 7 Feb. 2012 Permalink |  

    How not to photograph a Venus transit 

    As it happens, on June 6, 2012 we’ll have another one of those rare Venus transits. This means that Venus will pass directly in front of the Sun, similar to an eclipse (however, Venus is so small you won’t even notice it without special equipment).

    This is the second transit in the last 8 years, and after that, we won’t get another one for over 100 years. To get a feel for it, these are the showtimes:

    • 1761 & 1769
    • 1874 & 1882
    • 2004 & 2012
    • 2117 & 2125

    This year’s transit will unfortunately just be finishing when the sun rises where I live, so there won’t be much of a chance of taking pictures. However, stumbling upon that info I remembered that I actually took pictures of the 2004 transit. Yaay!

    Now, 2004 was quite different in terms of technology. I owned an entry-level digital camera (this one), poorly suited to such a task. (We’ve indeed come quite far in the last 8 years, with all the iPhones and Lytros and Angry Birds; I wonder what kind of tech they’ll have for the next one in 2117.)

    Instead, I had decided to use my father’s film camera — a Praktica PLC3 with a 200mm lens. I covered it with mylar film and shot blind. Blind as in “not being able to see my results and try out different exposures on the fly.” That’s the main reason the photos suck: I overexposed them all.

    It took me some time to find the film.

    I found it last week — still in the camera, after 8 years (and as it turns out, after Kodak went belly-up).

    I had it developed and scanned; all the pictures were heavily overexposed (I guess the average metering threw me off; luckily, film has broader exposure latitude than digital sensors, and some details remained that could be recovered in postprocessing).

    As if that were not enough, the mylar film left a nasty halo on almost every one of them. Eight years in the camera didn’t help either, so the grain’s pretty awful (even though I chose ISO 100 film for the exact reason of minimising grain).

    So this is it: two of the best pics; the first one of ingress, and the other one, somewhere in the middle.

    All in all, not that bad for film that’s been through so much. But a far cry from what I’d expected. The following picture is from Wikipedia and is awfully crisp. That’s because the original is a couple of times bigger, while my pics above are shown at actual size (scanned from film at 14 MPix).

    Note to self: next time use a telescope.

     
  • Urban 14:08 on 15 Jan. 2012 Permalink |  

    Subtitlr retires 

    The agony has gone on long enough: from an idea in 2006, to a proof of concept in mid-2007, to a business plan and hopes of a start-up (under the name Tucana, d.o.o.) in 2008, and straight into the dustbin of history.

    I’ve just pulled the plug and shut it down.

    Its ideas were good: a Wikipedia-inspired, revision-based subtitling and subtitle-translation service, which would help spread knowledge in the form of Flash-based video clips. It’s been obsoleted by other projects with more traction, such as DotSub and TED translations (incidentally, most of the clips that inspired me, and that I wanted to share with people whose first language was not English, came from TED itself). Now that YouTube’s speech recognition and Google’s machine translation have gotten much better, there’s less and less need for painstaking transcription and all the manual work.

    If I had to choose one thing to blame for its lack of success, underestimating the difficulty of transcribing video would be it. It literally takes hours to accurately transcribe a single clip that’s no more than a couple of minutes long.

    I’ve tried rebranding and repurposing it into a Funny clip subtitler and at least got some fun and local enthusiasm out of that. However, it’s all a part of one big package which now needs closure.

    Some ideas I’d had were never implemented, although I thought they had great potential; I wanted to bring together large databases of existing movie and TV-show subtitles with the publicly available video content in Flash Video. Since at the time almost all video on the web was FLV, there was no technological barrier. There are still lots of popular TV shows, movies, etc. buried deep in the video CDNs (YouTube, Megavideo, Megaupload), and large databases of “pointers” are maintained and curated by different communities (Surfthechannel.com, Icefilms.info). Having the video and the subtitle available instantly, without the cost of hosting large files, was a textbook mash-up idea.

    I’m posting some screenshots below, for the future me, so I can remember what I was spending countless hours of my time on. Yes, the design’s ugly, but bear in mind it was all the work of one man, eager to add functionality, and pressed into kickstarting the content generation by also transcribing and translating a bunch of videos.

    Thanks to all who shared the enthusiasm and helped in any way.

    Main page

     

    Video page

     

    Subtitle translation

    Rebranded as a Hitler-parody subtitle editor

     

     