Browsed by Category: Random Thoughts

The Social Web

In the past I have mentioned Facebook and related sites. Whenever I have talked about them, however, it has been in a technical capacity. I never really gave much thought to the why.

Very quickly, what could we say about Facebook on a technical note? Well, the site is– or, at the very least, appears– dead simple. A user enters some data into a web-based form. That data is stuck into a database for long-term storage. Later, another user wants said data, so it is retrieved and displayed. Simple. There is nothing wrong with this; I have always preferred that everything be made as simple as possible, but no simpler. So why is Facebook so damn popular if it does not give us anything we did not already have?
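To make the point concrete, here is roughly the whole idea as a minimal sketch in Node.js (the in-memory array standing in for the database, and the toy endpoints, are my own illustration, not anything Facebook actually does):

    // A toy "wall": POST text in, GET the stored posts back out.
    const http = require("http");
    const posts = []; // stand-in for the long-term database

    http.createServer((req, res) => {
      if (req.method === "POST") {
        let body = "";
        req.on("data", (chunk) => (body += chunk));
        req.on("end", () => {
          posts.push(body); // "stick it in the database"
          res.end("saved\n");
        });
      } else {
        res.end(posts.join("\n") + "\n"); // retrieve and display
      }
    }).listen(8080);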

The answer is that it does. It gives us something that is harder to measure: easy communication for everyone. Not just for the protocol engineers who speak Nerd, not just for the computer programmers (those handsome devils) who make software, but for everyone. Before the rise of Facebook, long-distance communication was geared more towards one-on-one interaction: the telephone (later the cellular phone), e-mail, etc. These were all giant steps forward but did not easily address “the crowd.” If you wanted to talk to several people on the phone you would need to make several phone calls. There is also a second issue with most communication methods: they happen in real-time. If I want to talk to someone on the phone they need to stop what they are doing to talk back. Real-time is a great goal for most projects but not always the best solution for all. I do not know about you but my friends get grumpy when they need to, say, stop sleeping because I called them.

So here comes Facebook: a graffiti-tagged wall of whatever. Not only can you communicate with others but you can do it outside of normal business hours, without the pesky are-they-available dilemma. It is a mix of instant messaging, Internet forums, and three-way calling all in one. There are no new concepts here, just great application of old ones. The why is the community. The why is the emotion.

OK, so I am late in getting my brain wrapped around this. Perhaps it is a serious shortcoming of mine, but now that someone got me started I am very interested.

New Uses for Old Laptops

I recently came across an old, forgotten pile of discarded laptops. Over the years, as I have replaced slower models, I never sold off or gave away my old stuff. I was wondering what I could do with such old hardware (some of which sported as little as 256MB of RAM) and it hit me: servers.

Think about it. All the required hardware is built in. It has a sort of UPS already integrated (the battery), takes little power, outputs little heat, and takes up nearly no space. Plus it already has a keyboard, mouse, and display with it at all times, which can neatly fold away when not needed. Stick Linux and a low-impact service– like a web server– on there and you have yourself the best paperweight ever.

I want to take this idea further. What if I were to deploy a small farm of low-end netbooks? What if I were to have one or two load balancers in front of them? All of a sudden my low-end hardware functions as if it has a pair.
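For what it is worth, the load balancer itself could be another laptop. A naive round-robin reverse proxy in Node.js might look like this (the backend hostnames are hypothetical, and a real deployment would reach for something like HAProxy or nginx instead):

    // Naive round-robin proxy: each request goes to the next backend in line.
    const http = require("http");
    const backends = [
      { host: "netbook1.local", port: 8080 },
      { host: "netbook2.local", port: 8080 },
    ];
    let next = 0;

    http.createServer((clientReq, clientRes) => {
      const backend = backends[next++ % backends.length];
      const proxyReq = http.request(
        { host: backend.host, port: backend.port, path: clientReq.url,
          method: clientReq.method, headers: clientReq.headers },
        (proxyRes) => {
          clientRes.writeHead(proxyRes.statusCode, proxyRes.headers);
          proxyRes.pipe(clientRes);
        }
      );
      clientReq.pipe(proxyReq);
    }).listen(80); // port 80 needs root; any port works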

Just food for thought…

Hiding JavaScript? Maybe….

As anyone around me knows (because I will not shut up about it), I have been working on a new project. Said project relies very heavily on JavaScript and revolves around an unusual use for a web browser that I do not want to advertise just yet. Because of this I have been looking for ways to hide my HTML, CSS, and JavaScript from the client. The short answer I discovered? You cannot.

Or can you? Of course, if any software is going to run code it will have to have a copy in one form or another. With a scripting language the code is presumably viewable to anyone, right? With JavaScript it is viewable through the View Source option of the user's browser, which makes everyone from a curious hacker to your grandmother your worst enemy (I love grandma unless she steals my stuff). You can obfuscate your code but that does not really fix the problem; anyone with half a brain can decode anything the browser can decode because all the tools they need are already in front of them. What to do, what to do?
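To illustrate how weak obfuscation is, consider the classic encode-and-eval trick (the payload here is my own toy example):

    // The "hidden" code, shipped as base64:
    var payload = "Y29uc29sZS5sb2coInNlY3JldCBzYXVjZSIpOw==";
    eval(atob(payload)); // runs: console.log("secret sauce");

    // ...but the decoder ships with the page, so anyone can paste the
    // same payload into their browser console and read the source back:
    console.log(atob(payload)); // prints: console.log("secret sauce");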

Although it does not solve the problem completely, I am considering a new project. A project that might hide virtually everything but still allow the browser to render properly. What if this method were inherently cross-platform and completely transparent to the client? What if this method not only offered a developer a lot more security but also provided an API that made web applications stateful with any unmodified, off-the-shelf web server, and a lot more efficient on bandwidth?

I may soon start running experiments to test feasibility but I do not foresee any reason my idea would not work. Perhaps this could even be a marketable product…

Persistent Worlds and Their Storage

Over the past few months I have been putting together an MMO-style bit of software. Since it is more of an experiment than anything else I did not start with a design plan. That is not to say that most things are not planned beforehand, but I have no idea what will work best so I am trying a number of things from the hip first.

Right now I am working on the basis of what will make it multiplayer. The decision I have to make now is how the data will be stored and how the clients will access it.

  • I could store everything in an SQL database. This is attractive for its persistence and accessibility across multiple platforms and languages. The downside is I cannot control what is cached and what is on disk as much as I would like. Every now and again I may take a huge hit in performance as it was not designed for this task. I may hit a bottleneck much sooner in a high-concurrency situation than I otherwise would.
  • I could use memcached. This is attractive for the obvious reason: blinding speed. The downside is I would have to do much more work in code since it does not guarantee stored data will exist when I need it. This increased work could place my bottleneck on my CPU when it is already pretty loaded from other tasks. I would not know the full effects of this until after the project is mostly complete, leaving me in a chicken-or-egg situation.

I am sure there are many other options. These are the two I am aware of that seem best suited to my task right now.

No matter what I do I will build a very lightweight abstraction layer so I can switch between different designs quickly. This will save a lot of time later on so I do not have to reinvent the wheel over and over again with each test.
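Roughly, the shape of that layer– a single get/set interface every backend has to satisfy (the class names and the in-memory stand-in are illustrative sketches, not final code):

    // Every backend implements the same two calls; swapping designs is
    // then a one-line change where the store is constructed.
    class MemoryStore {
      constructor() { this.data = {}; }
      async get(key) { return this.data[key]; }
      async set(key, value) { this.data[key] = value; }
    }

    // A SqlStore or MemcachedStore would expose the identical interface,
    // hiding the driver details behind get/set.
    function makeStore(kind) {
      if (kind === "memory") return new MemoryStore();
      throw new Error("backend not wired up yet: " + kind);
    }

    const store = makeStore("memory");
    store.set("player:42:pos", "10,20")
      .then(() => store.get("player:42:pos"))
      .then((pos) => console.log(pos)); // "10,20"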

Crazy RAID Fun

I have been experimenting with a number of things in Linux as of late. Breaking out my mad scientist cap– did I ever put it away?– one of the things I started screwing with was software-based RAID arrays in Linux. I wanted to see if I could create a number of encrypted, compressed files, spread them all over the world, and mount them in a RAID array (there is no way to say that any simpler without taking away from it). I put together a little plan in my head and I was off.

I set up a handful of Ubuntu installations in different geographical locations and connected them via OpenVPN and NFS (CIFS would also have worked, although probably with a noticeable performance hit). Using dd I created a 1GB file on each of them. Using losetup and its built-in encryption I mounted them all as loopback devices on my local machine. Using mdadm I turned the loopback devices into a RAID array and stuck ext4 on it. Using fusecompress I mounted said ext4 volume to compress everything on the fly.
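A sketch of that sequence as a small Node.js wrapper around the commands– the mount points, device names, and mdadm/mkfs parameters are placeholders to adapt, the legacy cryptoloop-style losetup -e option is deprecated on newer systems (dm-crypt replaced it), and everything needs root:

    // Runs the steps described above; assumes the NFS exports are mounted.
    const { execSync } = require("child_process");
    const run = (cmd) => { console.log("+ " + cmd); execSync(cmd, { stdio: "inherit" }); };

    const mounts = ["/mnt/site1", "/mnt/site2", "/mnt/site3", "/mnt/site4"];

    // 1GB backing file on each remote export
    mounts.forEach((m) => run(`dd if=/dev/zero of=${m}/blob.img bs=1M count=1024`));

    // attach each file as an encrypted loopback device
    mounts.forEach((m, i) => run(`losetup -e aes /dev/loop${i} ${m}/blob.img`));

    // stripe the loopback devices into one md array and format it
    const loops = mounts.map((_, i) => `/dev/loop${i}`).join(" ");
    run(`mdadm --create /dev/md0 --level=0 --raid-devices=${mounts.length} ${loops}`);
    run("mkfs.ext4 /dev/md0");
    run("mount /dev/md0 /mnt/raid");

    // layer fusecompress on top for on-the-fly compression
    run("fusecompress /mnt/raid /mnt/raid-compressed");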

What possible use could this serve? Well, there is the reason I did it: it is jawesome sauce and I wanted to see if I could. Those reasons, of course, are not a purpose. I suppose, if one really wanted to hide their data, they could use this in a RAID0– or a RAID5/6, even– to spread the data around using a very small cluster size. Theoretically, no two "neighbouring" clusters on the filesystem would be at the same geographical location. This means that if one– or more– sites were compromised, not only would the attacker have so little of the data as to be useless, but due to the way encryption tends to work (it is nearly impossible to decrypt anything without the "neighbouring" data) it would be extra useless to them.

Like so many of my experiments this was never meant to be a practical anything. Yeah, I got it working with little difficulty– the real problem was driving around and convincing my friends to let me use their houses and bandwidth– but I can think of a number of issues that would arise in real-world use. First off, the whole concept is predicated on the use of files mounted on network exports. Files on exports potentially far, far away from you. If a network link goes down– or hiccups– you would probably have a bit of an issue. Sure, it is all in a RAID, but your NFS– or CIFS– mounts are going to wait a while before they decide to let you know they have gone down. I imagine this would manifest itself as a temporary “freeze” of the filesystem but I have not tested it. A second issue is that if you are using a RAID0 for maximum security (as mentioned above), losing anything at all kills the whole setup. Consider that if your backup is not as secure as your primary then what is the point? Thirdly, depending on which RAID level you choose, you may quickly realise that not all link bandwidth or latency is created equal. I did run into a common mdadm issue where it did not release the devices, but I did not put any effort into fixing it.

All in all I am pretty excited for no good reason; I suppose I just thought it was neat. I do not recommend relying on this unless you can solve the above problems. iSCSI was designed for such things, so if you are hell-bent on implementing this idea I suggest you use that instead of the loopback devices. You might have to find a new way to encrypt everything (I bet TrueCrypt would work).

I did write a few short scripts to make my life easier but they have no error checking so I decided not to post them. If there is enough interest I could finish them (i.e. add the error checking) and post them.

Working Around JavaScript Shortcomings

I am working on a real-time, JavaScript-only project. I do not want to give too much away right now but I will say this: JavaScript was not designed for what I want it to do. The timers are not accurate enough, and relying solely on synchronous or asynchronous communication between components simply will not work at all. What it comes down to is that this is what C or C++ was meant for.

I have spent hours, today alone and not to mention yesterday, just reading. Reading up on tricks to make your own timers, and on how Internet Explorer on Windows or Firefox on Linux might react to whatever– and how reliably. But I am determined; I am determined to make what I envision work as I envision it with nothing more than what everyone's browsers already have. A friend just suggested I use Flash but Flash has way too many issues with performance and cross-compatibility (say what you want, I am sticking to that). In two words? Fuck. Flash.
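One of the recurring tricks from that reading, for the record: never trust setInterval; re-arm setTimeout yourself and subtract the drift you just measured. A minimal sketch (the function name is mine):

    // Fires callback roughly every intervalMs without accumulating drift.
    function steadyInterval(callback, intervalMs) {
      let expected = Date.now() + intervalMs;
      function tick() {
        const drift = Date.now() - expected; // how late the timer fired
        callback();
        expected += intervalMs;
        // schedule the next tick early by the amount we were late
        setTimeout(tick, Math.max(0, intervalMs - drift));
      }
      setTimeout(tick, intervalMs);
    }

    // usage: a once-a-second tick that stays honest over the long run
    steadyInterval(() => console.log(Date.now()), 1000);

Each individual tick can still land a few milliseconds off– the browser makes no hard guarantees– but the error no longer piles up over time.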

I have written about how anal I am in the past, especially when it comes to things like this. I refuse to be beaten by a scripting language, let alone a scripting language built into a God damned web browser. If I may bring my ego into this– too late– it would also be great to be “the guy” who pulled this off. The guy who people copy. The guy who starts a bunch of copycat projects.

I have learned a lot thus far. I am convinced this is very doable. It is all just going to require some research, cursing, work, and cursing. This is going to be great.

Microsoft Visual Studio C++ 2010

I just built myself a new Windows VM. After I spent ~six hours patching and rebooting several thousand times, I went to install Microsoft Visual C++ Express. I had not yet had a chance to play with it before today but 2010 has been released into the wild. So I download it and start the installer. To my absolute surprise the bare-bones install is 2.4GB. Two, point, four gigabytes. I was floored.

I have not been quiet about my great exodus, not so much towards Linux but away from Windows. I used to get phone calls every week from family and friends about how their computers were slow, something stopped working out of the blue, or whatever. I have since moved most of them to 100% free software built by the community and have not heard of any problems since. As a matter of fact my father (who requires Windows-based software for work) uses my mother’s Ubuntu-based laptop whenever possible because he is absolutely sick of the headaches. This is a man who does not know a mouse from a trackball, and given his inexperience even he is sick of the God-damned horror show that is Microsoft Windows.

Anyway, back to the point before I wrap up my short rant. I will now move all of my software development over to other operating systems and cross-compile from now on.

Hey, Microsoft. I am disgusted by you. I have a choice and I no longer choose you.

Automatic Games

I have always loved games that masterba… er, play themselves out. In such games the player sets the initial conditions– perhaps even writes a little code or designs something– and then lets it all hit the fan.

As of late I have been staring at Gratuitous Space Battles. In this one you design a small fleet of space ships, complete with hulls, engines, weapons, shields, and the like, and then set them against waves of enemies. The beauty of the game is no setup will work equally well against every enemy (at least once you are past the first few levels, that is).

This has re-sparked my interest in an idea for something I am currently calling Evolution Battle (yes, yes, it is a dumb name). I envision it as SimLife mixed with something similar to Gratuitous Space Battles. Players would create the “life” with its basic attributes and then stick it in the world with other “life” to compete for resources. I think it would be a great project for me since it would involve a few technical challenges I am not sure I have encountered before.
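The core loop could be tiny. A toy sketch of the idea (every attribute name and number here is a made-up placeholder, not a design):

    // Creatures compete for a shared food supply; the "design" is the stats.
    const world = {
      food: 100,
      creatures: [
        { name: "grazer", speed: 2, metabolism: 1, energy: 10 },
        { name: "sprinter", speed: 5, metabolism: 3, energy: 10 },
      ],
    };

    function tick(world) {
      for (const c of world.creatures) {
        // faster creatures grab more food, but burn more energy doing it
        const eaten = Math.min(world.food, c.speed);
        world.food -= eaten;
        c.energy += eaten - c.metabolism;
      }
      world.creatures = world.creatures.filter((c) => c.energy > 0); // starvation
      world.food += 3; // resources regrow each turn
    }

    // set the initial conditions, then let it all hit the fan
    for (let turn = 0; turn < 50; turn++) tick(world);
    console.log(world.creatures.map((c) => `${c.name}: ${c.energy}`));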

Just an honorable mention for a few of my other favorites: Crazy Machines (a near-clone of The Incredible Machine), the classic Conway’s Game of Life, Robocode, Bloons Tower Defense 3, and Lemmings (sort of).

The Future of Shopping

I am often amazed at what other people find amazing. For example, my mother recently sent me the video below, entitled “The Future of Shopping.” In this video a woman is at a clothing retailer browsing a digitised catalogue of their wares. She is able to “turn the page” by waving an arm and interact with on-screen buttons to select an item.

However cool such an interface may be– do not get me wrong, it is very, very cool– I fail to be as taken as the people around me. The reason behind this is probably the fact that I tend to view things as their individual parts rather than the subject as a whole. Take, for example, the Google Sky Map application on my phone. This application allows you to point your phone in any direction and shows you what stars, constellations, planets, etc. one would see if the Earth were not in the way and in the absence of bright lights. All the required technologies have not only existed for a while, but everyone is familiar with them in one form or another. Google Sky Map is just a clever application of nearly static images combined with a compass and GPS; they are old horses with new tricks. A neat application, but hardly as amazing as everyone seems to think it is.

Is everyone’s head that far in the sand? There seems to be a huge market for companies intentionally underestimating people…

[youtube]jDi0FNcaock[/youtube]