<![CDATA[Technically Correct]]>http://arnavdhamija.com/
<![CDATA[My Favourite Videogames (Part 2)]]>http://arnavdhamija.com/2018/11/03/my-favourite-videogames-part-2/5c360e6fce88a669ed3a7a9cSat, 03 Nov 2018 07:32:06 GMT

One post isn't enough to do justice to the number of fantastic games I've played of late. I've played a large number of games from 2017-18 and I have three more that make the cut for the Greatest Of All Time in my books.

EarthBound (SNES)

Released 1995. Played 2017.


Released as Mother 2 for the SNES in mid-'90s Japan, EarthBound is the quirkiest game I've ever played. Despite being over two decades old, the humour still feels fresh and the game mechanics are remarkably prescient of what future RPGs would come to use.

At its core, EarthBound is a 2D JRPG where 13-year-old Ness has to save the world from the wrath of an evil...force(?)...called Giygas. It has most of the core JRPG elements such as turn-based battles, multiple party members, inventory management, and a Health/Battle Points system. What really distinguishes EarthBound is the absolutely insane variety in its enemies. You battle absolutely everything - from a pile of puke to Blue KKK Cultists to taxicabs and Salvador Dali clocks. In stark contrast to the brain-dead NPC dialogue in modern RPGs, EarthBound has some of the funniest lines of NPC dialogue you will ever find. The soundtrack is enjoyable and surprisingly detailed in places, especially given the 4MB size of the game. To truly appreciate EarthBound, it's best to spend some quality time playing it slowly. Every single frame of this game just oozes with charm.

[Image: Just EarthBound things]

Perhaps the best thing about EarthBound is how fearlessly creative it is. Unlike contemporary videogames, EarthBound doesn't hold back from breaking new ground in favour of familiar elements that would appeal to a broader audience. This fearless approach did result in EarthBound's commercial failure on launch in North America, where it was puzzlingly marketed with the tagline "This Game Stinks!". Perhaps Nintendo of America was banking a little too heavily on the gross fascinations of pubescent boys to turn EarthBound into a blockbuster success.

However, EarthBound is far from a perfect game. Old as it is, not all of its mechanics have aged well. Put mildly, the inventory management system is downright terrible. The turn-based combat gets very tedious after a while (though there is an AUTO battle mode to somewhat fix this). Moreover, the player characters move very slowly in relation to the size of the game world. The Run button from Pokémon games is sorely missed here. In all, it's quite clear that the game was designed for an era in which there were far fewer competing distractions for one's attention. As I played the game in an emulator, I 100x'ed some of the slower parts to get through the game. Nevertheless, a single playthrough of EarthBound was enough to convince me that this is a severely under-appreciated Nintendo classic. If videogames were works of art, EarthBound would be the forerunner of the Surrealist movement. It's one of Shigesato Itoi's and Satoru Iwata's finest masterpieces.

EarthBound was a key inspiration for Toby Fox's Undertale released in 2015. At first Undertale felt like a distilled version of EarthBound, sharing the same sense of humour, but with significantly better pacing and an interesting combat system. Yet, on finishing Undertale, I was more of the opinion that it was a watered down version of EarthBound. It didn't capture my imagination the way EarthBound had.

After the release of EarthBound, Itoi and Nintendo worked on Mother 3. Mother 3 endured development hell for twelve years, being cancelled and reannounced across nearly three generations of Nintendo consoles. In the end, Mother 3 was finally released for the GBA in Japan in 2006 (two years after the release of the DS!). Unfortunately, Mother 3 still hasn't been officially translated for an international release, making the series a lost Nintendo legend. If you're interested in the story of EarthBound's development and the game itself, AVGN's video is a great 40-minute primer:

Portal 1 (PC)

Released 2007. Played 2018.


I had a blast playing Portal 1 this January after picking it up in the Steam winter sale. It's easily the most recognisable game on this list and it needs no introduction.

Portal 1 is the epitome of good game design. There's minimal hand-holding and the levels are perfect in length and difficulty to give the player an intuitive understanding of the game mechanics before moving on to the more complicated puzzles. Instead of being ornamentally detailed, the visuals are just sufficient to convey essential information without being distracting.

There are no low points during the game and the entire experience is thoroughly enjoyable. Portal 1 demonstrates perfect pacing in a videogame. So many things in Portal 1 are just right. Given that it takes only six hours to finish and is available for as little as ₹30 on sale, it's an excellent addition to anyone's Steam library.

Portal 2 expanded on the formula established in Portal 1 with more complicated puzzles and new mechanics. The voice acting is even better with the snappy Wheatley and the hilariously deadpan delivery of Cave Johnson. However, the perfect pacing of Portal 1 didn't carry over to Portal 2. In several mid-game levels, the cluttered environment and confusing lighting effects made it difficult to find a white tile to open a portal. I would have much rather had the simpler lighting model of Portal 1 as it was far easier to visually parse. That said, Portal 2 still earns a notable mention in my list.

Spelunky (PC)

Released 2011. Played 2018 - Present.


The final game on this list is also the one I've played the most this year. I was lucky enough to snag it on the Steam winter sale last year and it's easily the best ₹50 I've ever spent.

Spelunky is an insanely difficult 2D platformer with 16 randomly generated levels in each playthrough. There is no concept of saving progress in Spelunky and death is permanent. These two features make it impossible to play by memorising the levels as you can in a conventional 2D platformer. Every moment of Spelunky is filled with compromise and multiple branching decisions. Picking the wrong one can send you all the way back to the beginning. With just 4 HP, 4 Bombs, and 4 Ropes at the start of each run, the Spelunker isn't even much stronger than the common monsters, most of which have between 3 and 10 HP.

This makes death an innate part of the game's progression system. In the absence of in-game saving, progression takes the form of the player becoming better at playing Spelunky. Each death teaches the player a bit more about the game's mechanics. Death in Spelunky is almost always fair, meaning that more often than not, it is due to your own mistakes.

The randomly generated levels give Spelunky infinite replayability, for no two runs ever share the same levels. Certain patterns and obstacles may become familiar after a while, but clearing an entire level requires coming up with a plan on-the-fly for each section. The best moments of the game are had in using an unfamiliar new item to sweep through the game in ways I would have never imagined.

[Image: I think I'm done for the day.]

Getting better at Spelunky is the most satisfying feeling I have ever experienced in a game. An entire run of Spelunky from 1-1 to the final boss in 4-4 can be done in as little as 20 minutes, but it takes just as many hours to get good enough at the game to do so.

Spelunky is also chock-full of comic relief and dark humour. I absolutely adore the cartoonish art style, which belies the incredible difficulty of the game. Nothing in the game takes itself seriously and even dying to a chain reaction of events can be absolutely hysterical at times.

You can rescue Damsels in distress for a point of health, but you also have the option of sacrificing them on the altar of Kali for a free item. Damsels are also effective projectiles and can be thrown to kill other enemies. Shopkeepers, the ornery merchants of the Spelunky world, are another source of great amusement. Items from shops can either be bought or stolen. Stealing earns you the contempt of all the shopkeepers, who will do everything they can to kill you for the rest of the run. It is this kind of crucial decision making which really gets at the heart of what makes Spelunky great. Mark Brown's excellent video explains the level generation system and goes into the decision-making process which makes Spelunky what it is:

The game is also incredibly mechanically sound. There are so many options for traversing a level from top to bottom. Every tile of the Spelunky world can be destroyed with a well-placed bomb, making resource management an important theme of the game. The controls are also razor-tight. Thanks to the low inertia of the Spelunker, playing the entire game on a keyboard is completely feasible. In fact, it's my preferred way to play Spelunky.

Not only is it super addicting to play, it's addicting to watch as well. My early forays in Spelunky were with my wingies jeering every death. I ended up becoming a far more careful player as a result. I tried to learn as much as I could about the game mechanics in every run. In some ways, this single player game brings people together more than some multiplayer games can.

Lastly, Spelunky also hides a number of secrets under its 2D tiled world. The most significant of these is the much-vaunted Hell run, which requires true mastery of the game to complete. At the time of writing, I have clocked up 1500+ deaths in Spelunky, out of which I've reached Hell just five times.

I will go as far as to say that Spelunky is probably the best videogame I've ever played and I am absolutely HYPED for Spelunky 2 coming out in 2019.

]]>
<![CDATA[The Last Lecture]]>http://arnavdhamija.com/2018/09/19/the-last-lecture/5c360e6fce88a669ed3a7a9dWed, 19 Sep 2018 16:21:08 GMT

I've read a lot of books in the last two months.

Page for page, I've already far surpassed my 2017 reading throughput this year. Along with my usual cache of fiction and space-flight books, I have read some beautifully written books about cancer and the process of dying such as "When Breath Becomes Air" and "Being Mortal". As much as I hope that advances in medical science will outpace mortality in the next fifty years, death is the only certainty we can contend with in life. I found that reading these books helps me understand what happens at the End Of Life. It's strangely comforting to have a deeper insight into death.

This is what brings me to The Last Lecture, a talk I stumbled upon when binging on Coding Horror.

Dr. Randy Pausch was a professor at CMU and was in the vanguard of the development of virtual reality in the 1990s. Unfortunately, he was diagnosed with pancreatic cancer in 2006. After being told that his cancer had turned terminal, he delivered his final lecture at CMU about "Achieving Your Childhood Dreams" (linked above). It's an incredibly inspirational talk and Randy Pausch is so charismatic that you can't help but watch the entire video in a single sitting.

Despite the premise, the talk is not about cancer or dying at all. Instead, Dr. Pausch shares anecdotes from his life about how he chased his own whimsical childhood dreams.

I loved many parts of this lecture, so I have copied the parts of his excellent book which resonated with me the most:

On failure:

Over the years, I also made a point of telling my students that in the entertainment industry, there are countless failed products. It’s not like building houses, where every house built can be lived in by someone. A video game can be created and never make it through research and development. Or else it comes out and no one wants to play it. Yes, video-game creators who’ve had successes are greatly valued. But those who’ve had failures are valued, too—sometimes even more so.

Start-up companies often prefer to hire a chief executive with a failed start-up in his or her background. The person who failed often knows how to avoid future failures. The person who knows only success can be more oblivious to all the pitfalls.

Experience is what you get when you didn’t get what you wanted. And experience is often the most valuable thing you have to offer.

It's a cliché that experience is the best teacher. But, as Randy Pausch would have said, it's a cliché for good reason, because it's so damn true. From one of the most influential talks I've had the pleasure of attending in person, I have come to realise that failing spectacularly is better than being a mediocre success.


It's a point which I try to shoehorn into every public talk I give to juniors. Sure, you might get laughed at and trampled on for failing terribly, but once you've experienced that, you will realise that failure isn't something to be all that scared about. You'll learn from the mistakes you made when you failed, and eventually, you might even learn to laugh at yourself for failing the way you did.

On brick walls - the insurmountable obstacles that stop you from getting what you want:

The brick walls are there to stop the people who don’t want it badly enough. They’re there to stop the other people.

There's a lot of meaning in those two sentences, and they have really changed my perception of "not taking no for an answer". The "other people" refers to those who don't have the tenacity or the zeal to achieve their childhood dreams. The brick walls are a test for seeing if you're really sure that you want something, be it a degree, a job, or maybe a way to finance your latest hobby. A lot of challenging decisions in life become much clearer when perceived this way. If you don't find the motivation to attempt climbing the brick wall between you and your dreams, then maybe those dreams weren't yours in the first place.

On the ultimate purpose of teachers:

In the end, educators best serve students by helping them be more self-reflective. The only way any of us can improve—as Coach Graham taught me—is if we develop a real ability to assess ourselves. If we can’t accurately do that, how can we tell if we’re getting better or worse?

Self-reflection and introspection are very underrated. It's easy to reduce education to a commercial service (only more true these days than ever) and teachers as the providers. But, education is something of a "head fake". As Randy Pausch describes it, head fakes are a type of indirect learning, often more important than the content of what is being taught. He elaborates on this by saying:

When we send our kids to play organized sports—football, soccer, swimming, whatever—for most of us, it’s not because we’re desperate for them to learn the intricacies of the sport.
What we really want them to learn is far more important: teamwork, perseverance, sportsmanship, the value of hard work, an ability to deal with adversity.

I can't remember the conditions needed for each oxidation state of Manganese, but I do remember learning how to learn and how to get through some very sticky situations. Most of the benefits I've gleaned from school are less quantifiable than exam results, but I wouldn't trade those life lessons for anything. It's this kind of meta education which I find sorely lacking at BITS. I miss the deeper perspectives on my subjects which made me question the way things are. Fiercely independent college students may not need the same hand-holding as naive school students, but some kind of guidance is just as necessary.

On hard work:

A lot of people want a shortcut. I find the best shortcut is the long way, which is basically two words: work hard.

As I see it, if you work more hours than somebody else, during those hours you learn more about your craft. That can make you more efficient, more able, even happier. Hard work is like compounded interest in the bank. The rewards build faster.

On apologising to people:

Apologies are not pass/fail. I always told my students: When giving an apology, any performance lower than an A really doesn’t cut it.

Halfhearted or insincere apologies are often worse than not apologizing at all because recipients find them insulting. If you’ve done something wrong in your dealings with another person, it’s as if there’s an infection in your relationship. A good apology is like an antibiotic; a bad apology is like rubbing salt in the wound.

]]>
<![CDATA[My Favourite Videogames (Part 1)]]>http://arnavdhamija.com/2018/09/08/my-favourite-videogames/5c360e6fce88a669ed3a7a9bFri, 07 Sep 2018 18:42:07 GMT

I love videogames. I can't call myself a serious gamer, but I've played my share of good and bad games ever since I started playing Super Mario Advance 4 on a GBA SP fourteen years ago. Some games have left a deeper impression on me than others. Objectively, the games in this list aren't necessarily the best I have ever played, but they are the ones with which I've made the most memories.

Microsoft Flight Simulator 2004 (PC)

Released in 2003. Played from 2005 - 2013.


This is the first major PC game I ever played. My dad got it for me, along with a Logitech Extreme 3D Pro joystick, on one of his business trips. Eight-year-old me was blown away by the packaging of the game - it came with several booklets in a box, and the game itself was split across as many as four disks! I fondly remember being awed at the number of pre-loaded airplanes, the 3D graphics (and gorgeous virtual cockpits!), and how hellishly difficult it was to play on our vintage 1998 desktop. The poor desktop's Pentium 3 CPU could only just sputter out 5-6 fps at the lowest settings on a good day. Not to mention that the game's relatively enormous 2.5 GB installation size took up more than half of the computer's 5GB HDD. Many years later, I realised that dealing with these absurd hardware limitations is what got me into computers in the first place.

Nevertheless, my desktop's measly hardware hardly deterred the tenacious pilot in me. I enjoyed piloting small aircraft like the Piper Cub just as much as I enjoyed flying the 747-400. There was something so very satisfying in landing a plane safely with the sun creeping over the horizon at dawn. It didn't take me long to find the very active modding scene for FS2004 on websites like Simviation.com. Several add-ons are still actively developed to this day.

However, it wasn't until we got a new computer in 2008 that I really got to see the game for what it was. The new computer was more than capable of pushing out 60 fps at Ultra High quality. Scenery details, water shaders, and even the runway textures had a photo-realistic effect to them. I scoured every glitched nook and cranny of the Flight Simulator world. I mastered the art of flaring commercial aircraft before landing and taxiing them back to the terminal gate.

At this point, my add-on collection had swollen to over 10GB, nearly four times as large as the original game. There were several oddballs in my collection - including a Mitsubishi Pajero, an Asuka Cruise Ship, and a Bald Eagle which flew like a glider for all intents and purposes. My favourites among my add-on collection were the MD-9, DC-10, OpenSky 737-800, XB-70 Valkyrie, and an amazing Concorde pack complete with dozens of different liveries. I also liked flying the Tupolev Tu-95 and the Lockheed U-2 "Dragon Lady". Alas, I couldn't shift my joystick setup after moving to college. Had I been able to, there is a good chance I would still be playing it - fifteen years after the game's initial release.

Need For Speed: Most Wanted (PC)

Released in 2005. Played from 2008 - 2011.


The only racing game in this list and my favourite overall by far. NFSMW won't win any awards for realism in its driving physics (go play DiRT if you're a stickler for realism), but it certainly will for mindless fun. There is so much that EA got right with this game - the fresh, overbright visuals, excellent world design with plenty of shortcuts and obstacles to crush unsuspecting cop cars, a good progression system, and a wicked smart cop AI to make cop chases true white-knuckle affairs.

The storyline is simple and the game has plenty of expository cutscenes to make you hate your main rival, Clarence "Razor" Callahan for stealing your brand-new BMW M3 GTR. There are a total of fifteen bosses. On beating each boss, you get a chance to "steal" their uniquely styled car through a lottery system. Winning a boss's car made it much easier to progress through the game. Each car in the game's roster can be customised in every aspect and I spent several hours tuning and buying parts for all the cars in my garage to suit my playing style.

The sheer excitement of cop chases is something I haven't found in any NFS game I've played since MW. On maxing out a car's "Heat", the cops' cars would be upgraded from the bovine Ford Crown Victoria to the absurdly fast Chevy Corvette. Being able to tap into the police chatter was a superb game mechanic, letting one plan their way around road-blocks and spike-traps. The game also offered plenty of room for creativity to get out of tight situations in cop chases. Lastly, beating Razor at the end is right up there with the most badass moments I've ever had in a videogame.

I did play a handful of NFS games after MW, but they just fell flat on the fun factor which MW had in droves. NFS ProStreet brought car damage and photo-realistic graphics, but its serious racing style was a huge tonal shift from the absurd shenanigans of MW. NFS Shift was cut from the same cloth as ProStreet. Hot Pursuit looked like a return to form for EA, but sadly it couldn't live up to my expectations in gameplay, which just felt clunky compared to MW. I do look forward to playing the 2012 remake of MW to see if it can replicate the charm of the first one.

Pokemon Black (DS)

Released in 2011. Played from 2011 - 2014.


Pokemon Black is not the first Pokemon game I've played, but it is the first I played on a Nintendo console after rekindling my love for Pokemon. Coming off playing Platinum (which I had also enjoyed), Pokemon Black was a breath of fresh air. For perhaps the only time in the series' history, GameFreak took a number of risks, most of which paid off. In Black & White, only brand-new Unova Pokemon are available in the wild before beating the Elite Four. This forced players to learn the game's new mechanics and adopt new battling strategies. The visuals were the best I had ever seen from a DS game, with lovingly animated Pokemon sprite-work in the battles. Even the storyline was actually thought-provoking, far better than the "Ten Year Old Beats <EVIL> Team After World Domination" narrative that had been done to death in the earlier games.

The overworld was rich and varied and the game's season mechanic changed the appearance of several areas over the course of a year. The music was fantastic in several places and a major step up from the Gen IV themes.

Being a Pokemon game, it wasn't completely free of hand-holding, but it was at least far better than Sun & Moon, which came half a decade after Black & White. My major gripe with BW is that Unova is much too linear and literally forces you to travel halfway around the map to reach the Elite Four.

However, the regular gameplay was just a distraction from the wonderful Competitive Battling scene. This was the first Pokemon game in which I put my new-found knowledge of EVs and IVs to use. The Gen V Wi-Fi Competitive Battling scene was amazing thanks to Team Preview, BattleSpot, and International Tourneys held online every few months. Thanks to RNG manipulation and the help of an equally geeky friend, I managed to breed a full team of 6x31 IV Pokemon with ideal Natures, all obtained through completely legal means. I discovered my love of Double Battles through the International Tourneys, which I still maintain is the best way to play Pokemon. My go-to team in the Tourneys was an immensely bulky Tailwind Suicune with a wall-breaking Life Orb Gengar, backed up by a Physical Fighting Gem Virizion, and a Toxic Orb Gliscor or a Sitrus Berry Gyarados for shutting down any physical attackers. This unconventional team checked a wide variety of the common suspects in Gen V Doubles, most importantly the nearly omnipresent Rotom-W.

Pokemon also gave me my first taste of an online community. I joined Smogon in early 2012, and my Smogon username lives on in every online account I've made since then. I watched the birth of the Gen V metagame with my own eyes. Movesets, suspect tests, and analyses were uploaded and discussed at a furious pace on the Dragonspiral Tower sub-forum. Through briefly working for Smogon's C&C team and #grammar on Smogon IRC, I made some great online friends, some of whom I still speak with to this day. The community was warm and supportive. It's one of the many things I fondly remember being a part of back in 2012.

After Pokemon Black, I played Pokemon Alpha Sapphire in late 2015. AS was a delightful blast from the past and it fully satiated my decade-long nostalgia for Hoenn. This was in no small part thanks to the post-game Delta Episode which is one of my favourite sequences of any Pokemon game.

About a year later, I picked up Pokemon Moon but I could not get into it at all. I was thoroughly underwhelmed by the constant hand-holding which severely detracted from the core experience of the game. My aging 3DS died at about the same time I finished it, marking a fitting end to my relationship with the series for the time being. Here's hoping Gen VIII on the Switch is truly a special experience which Pokemon players have been missing since Black came out.

]]>
<![CDATA[Introducing APStreamline!]]>http://arnavdhamija.com/2018/08/17/introducing-apstreamline/5c360e6fce88a669ed3a7a9aFri, 17 Aug 2018 18:11:59 GMT

Download APSync with APStreamline built in from here (BETA)!

I'm very excited to announce the release of a network adaptive video streaming solution for ArduPilot!

Links to the code

APStreamline
APWeb server
My APWeb PR

About

The APSync project currently offers a basic video streaming solution for the Raspberry Pi camera. APStreamline aims to complement this project by adding several useful features:

  • Automatic quality selection based on bandwidth and packet loss estimates

  • Selection of network interfaces to stream the video

  • Options to record the live-streamed video feed to the companion computer

  • Manual control over resolution and framerates

  • Multiple camera support using RTSP

  • Hardware-accelerated H.264 encoding for the Raspberry Pi

  • Camera settings configurable through the APWeb GUI

I'm chuffed to say that this has not only met the requirements for the GSoC project but has also covered the stretch goals I had outlined in my proposal!

Streaming video from robots is an interesting problem. There are several different use cases – you might be a marine biologist with a snazzy BlueROV equipped with several cameras or a UAV enthusiast with the itch of flying FPV. APStreamline caters to these and several other use cases.

While APStreamline works on all the network interfaces available on the Companion Computer (CC), its main value lies in streaming over unreliable networks such as Wi-Fi, as in most cases the CC is used in Wi-Fi hotspot mode for streaming the video. Due to the limited range of 2.4GHz Wi-Fi, the Quality-of-Service (QoS) progressively gets worse as the robot moves further away from the receiving computer.

This project aims to fix this problem by dynamically adjusting the video quality in realtime. Over UDP, we can obtain estimates of QoS from the RTCP packets sent back by the receiver. These RTCP packets provide helpful QoS information (such as RTT and packet loss) which can be used to automatically change the bitrate and resolution of the video delivered by the sender.

Running the Code

Hardware

All the following instructions are for installing APStreamline and APWeb on the CC. A Raspberry Pi 2/3/3B+ with the latest version of Raspbian or APSync is a good choice. Intel NUCs are excellent choices as well.

Do note that the Raspberry Pi 3 and 3B+ have very low power Wi-Fi antennae which aren't great for video streaming. Using a portable Wi-Fi router like the TPLink MR-3020 can dramatically improve range. Wi-Fi USB dongles working in hotspot mode can help as well.

Installing APStreamline

Install the gstreamer dependencies:


sudo apt install libgstreamer-plugins-base1.0* libgstreamer1.0-dev libgstrtspserver-1.0-dev gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly python3-pip 

Install meson from pip and ninja for building the code:


sudo pip3 install meson 

sudo apt install ninja-build 

Navigate to your favourite folder and run:


git clone -b release https://github.com/shortstheory/adaptive-streaming 

cd adaptive-streaming 

meson build 

cd build 

sudo ninja install # installs to /usr/local/bin for APWeb to spawn 

On the Raspberry Pi, use sudo modprobe bcm2835-v4l2 to load the V4L2 driver for the Raspberry Pi camera. Add bcm2835-v4l2 to /etc/modules for automatically loading this module on boot.

Installing APWeb

The APWeb server project enables setting several flight controller parameters on the fly through the use of a Companion Computer (e.g. the Raspberry Pi). We use this APWeb server for configuring the video streams as well.

Clone the forked branch with APStreamline support here:


git clone -b video_streaming https://github.com/shortstheory/APWeb.git 

cd APWeb 

Install libtalloc-dev and get the MAVLink submodule:


sudo apt-get install libtalloc-dev 

git submodule init 

git submodule update 

Build APWeb:


cd APWeb 

make 

./web_server -p 80 

In case it fails to create the TCP socket, try using sudo ./web_server -p 80. Running the server as root can clearly cause bad things to happen, so be careful!

Usage

Video livestreams are launched using RTSP. RTSP is the recommended way to stream video as it supports multiple cameras, configuring the resolution on-the-fly, and recording the livestreamed video to a file.

APWeb

Start the APWeb server. This will serve the configuration page for the RTSP stream server. Connect to the web server in your favourite web browser by going to the IP address of the Companion Computer.

On navigating to the new video/ page, you will be presented with a page to start the RTSP Server:

[Screenshot: the APWeb page for starting the RTSP Server]

On selecting the desired interface and starting the RTSP Server, the APWeb server will spawn the Stream Server process. The stream server will search for all the V4L2 cameras available in /dev/. It will query the capabilities of all these cameras and select hardware encoding or software encoding accordingly. The list of available cameras can be refreshed by simply stopping and starting the server.

From here, the APWeb page will display the list of available RTSP streams and their mount points:

[Screenshot: the list of available RTSP streams and their mount points]

The video quality can either be automatically set based on the available network bandwidth or set manually for more fine-grained control.

The APWeb page also presents an option to record the video stream to a file on the CC. For this, the video stream must be opened on the client. Recording works with any of the manually set resolutions but not with Auto quality, because the dynamically changing resolution causes problems in the file-recording pipeline. UVC cameras are an exception and can record to a file in Auto mode as well.

The RTSP streams can be viewed in any RTSP player; VLC is a good choice. Some GCSes, such as QGroundControl and Mission Planner, can also stream video over RTSP.

In VLC, for example, this can be done by going to "Media > Open Network Stream" and pasting in the RTSP mount point for the camera displayed on the APWeb configuration page. However, VLC introduces two seconds of latency for jitter reduction, making it unsuitable for latency-critical applications. To circumvent this, RTSP streams can also be viewed at lower latency using the gst-launch command:

gst-launch-1.0 playbin uri=<RTSP-MOUNT-POINT> latency=100

An example RTSP mount point looks like rtsp://192.168.0.17:8554/cam0. Refer to the APWeb page to see the mount points given for your camera.
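Putting the two together, a low-latency playback command for that example mount point would look like this (substitute the IP and mount point shown on your APWeb page):

```shell
# Play the RTSP stream with a 100 ms jitter buffer instead of VLC's 2 s default
gst-launch-1.0 playbin uri=rtsp://192.168.0.17:8554/cam0 latency=100
```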

Standalone

Launch the RTSP stream server by running:

stream_server <interface>

The list of available network interfaces can be found by running ifconfig. Streams can be connected to using gst-launch.
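For example, to serve streams over the wireless interface and view one from a client (wlan0 here is an assumption — substitute the interface name reported by ifconfig, and the mount point reported by the server):

```shell
# On the companion computer: serve RTSP streams on the chosen interface
stream_server wlan0

# On the client: play a stream with a small jitter buffer
gst-launch-1.0 playbin uri=rtsp://<CC-IP>:8554/cam0 latency=100
```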

Things to Try Out

  • Use a variety of different cameras and stream several at the same time

  • Try recording the live-streamed video to a file on the CC

  • Play with the Auto quality streaming feature on different types of network interfaces

]]>
<![CDATA[GSoC 2018 - Batteries Included!]]>Much time has passed and much code has been written since my last update. Adaptive Streaming (a better name TBD) for Ardupilot is nearly complete and brings a whole slew of features useful for streaming video from cameras on robots to laptops, phones, and tablets:

  • Automatic quality selection based on
]]>
http://arnavdhamija.com/2018/07/22/gsoc-2018-batteries-included/5c360e6fce88a669ed3a7a98Sun, 22 Jul 2018 17:00:00 GMT

Much time has passed and much code has been written since my last update. Adaptive Streaming (a better name TBD) for Ardupilot is nearly complete and brings a whole slew of features useful for streaming video from cameras on robots to laptops, phones, and tablets:

  • Automatic quality selection based on bandwidth and packet loss estimates
  • Options to record the live-streamed video feed to the companion computer (experimental!)
  • Fine-tuned control over resolution and framerates
  • Multiple camera support over RTSP
  • Hardware-accelerated H.264 encoding for supported cameras and GPUs
  • Camera settings configurable through the APWeb GUI

Phew!

The configuration required to get everything working is minimal once the required dependencies have been installed. This is in no small part possible thanks to the GStreamer API which took care of several low level complexities of live streaming video over the air.

Streaming video from aerial robots is probably the most demanding use case for Adaptive Streaming, as the WiFi link gets very flaky at high speeds and long distances. I've optimised the project around my testing with video streaming from quadcopters, so the benefits carry over to streaming from other robots as well.

Algorithm

I've used a simplification of TCP's congestion control algorithm for Adaptive Streaming. I had looked at other interesting approaches including estimating receiver buffer occupancy, but using this approach didn't yield significantly better results. TCP's congestion control algorithm avoids packet loss by mandating ACKs for each successfully delivered packet and steadily increasing sender bandwidth till it reaches a dynamically set threshold.

A crucial difference for Adaptive Streaming is that 1) we stream over UDP for lower overhead (so no automatic TCP ACKs here!) 2) H.264 live video streaming is fairly loss tolerant so it's okay to lose some packets instead of re-transmitting them.

Video packets are streamed over dedicated RTP packets, and Quality of Service (QoS) reports are sent over RTCP packets. These QoS reports give us all sorts of useful information, but we're mostly interested in the number of packets lost between RTCP transmissions.

On receiving an RTCP packet indicating any packet loss, we immediately shift to a Congested State (better name pending), which significantly reduces the rate at which the video streaming bandwidth is increased on receiving a lossless RTCP packet. The H.264 encoding bitrate is limited to no higher than 1000kbps in this state.

Once we've received five lossless RTCP packets, we shift to a Steady State which can encode up to 6000kbps. In this state we also increase the encoding bitrate at a faster rate than we do in the Congested State. A nifty part of dynamically changing H.264 bitrates is that we can also seamlessly switch the streamed resolution according to the available bandwidth, just like YouTube does with DASH!

This algorithm is fairly simple and wasn't too difficult to implement once I had figured out all the GStreamer plumbing for extracting packets from buffers. With more testing, I would like to add long-term bitrate adaptations for the bitrate selection algorithm.

H.264 Encoding

This is where we get into the complicated and wonderful world of video compression algorithms.

Compression algorithms are used in all kinds of media, such as JPEG for still images and MP3 for audio. H.264 is one of several compression algorithms available for video. H.264 takes advantage of the fact that a lot of the information in video between frames is redundant, so instead of saving 30 frames for 1 second of 30fps video, it saves one entire frame (known as the Key Frame or I-Frame) of video and computes and stores only the differences in frames with respect to the keyframe for the subsequent 29 frames. H.264 also applies some logic to predict future frames to further reduce the file size.

This is by no means close to a complete explanation of how H.264 works, for further reading I suggest checking out Sid Bala's explanation on the topic.

The legendary Tom Scott also has a fun video explaining how H.264 is adversely affected by snow and confetti!

The frequency of capturing keyframes can be set by changing the encoder parameters. In the context of live video streaming over unstable links such as WiFi, this is very important as packet loss can cause keyframes to be dropped. Dropped keyframes severely impact the quality of the video until a new keyframe is received. This is because all the frames transmitted after the keyframe only store the differences with respect to the keyframe and do not actually store a full picture on their own.

Increasing the keyframe interval means we send a large, full frame less frequently, but it also means we suffer terrible video quality for an extended period after losing a keyframe. On the other hand, shorter keyframe intervals waste bandwidth.

I found that a keyframe interval of 10 frames worked much better than the default interval of 60 frames without impacting bandwidth usage too significantly.
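As an illustration (this is not the project's actual pipeline), the keyframe interval of a software x264 stream can be set through the encoder's key-int-max property in a GStreamer pipeline:

```shell
# Encode webcam video with a keyframe every 10 frames and save it to a file
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! \
    x264enc key-int-max=10 tune=zerolatency ! h264parse ! \
    matroskamux ! filesink location=test.mkv
```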

Lastly, H.264 video encoding is a very computationally expensive algorithm. Software-based implementations of H.264 such as x264enc are well supported with several configurable parameters but have prohibitively high CPU requirements, making it all but impossible to livestream H.264-encoded video from low-power embedded systems. Fortunately, the Raspberry Pi's Broadcom BCM2837 SoC has a dedicated H.264 hardware encoding pipeline for the Raspberry Pi camera, which drastically reduces the CPU load of high-definition H.264 encoding. Some webcams, such as the Logitech C920 and higher, have onboard H.264 hardware encoding thanks to special ASICs dedicated to this purpose.

Adaptive Streaming probes for the type of encoding supported by the webcam and checks whether it exposes the IOCTLs required for changing the encoding parameters on-the-fly.

H.264 has been superseded by the more efficient H.265 encoding algorithm, but the CPU requirements for H.265 are even higher and it doesn't enjoy the same hardware support as H.264 does for the time being.

GUI

The project is soon-to-be integrated with the APWeb project for configuring companion computers. Adaptive Streaming works by creating an RTSP Streaming server running as a daemon process. The APWeb process connects to this daemon service over a local socket to populate the list of cameras, RTSP mount points, and available resolutions of each camera.

GSoC 2018 - Batteries Included!

The GUI is open for improvement and I would love feedback on how to make it easier to use!

Once the RTSP mount points are generated, one can livestream the video feed by entering the RTSP URL of the camera into VLC. This works on all devices supporting VLC. However, VLC adds two seconds of latency to the livestream for jitter reduction. I wasn't able to find a way to configure this in VLC, so an alternative way to get a lower-latency stream is to use the following gst-launch command in a terminal:

gst-launch-1.0 playbin uri=<RTSP Mount Point> latency=100

Within the scope of the GSoC timeline, I'm looking to wind down the project by working on documentation, testing, and reducing the cruft from the codebase. I'm looking forward to integrating this with the companion repository soon!

Links to the code

https://github.com/shortstheory/adaptive-streaming

https://github.com/shortstheory/APWeb

Title image from OscarLiang

]]>
<![CDATA[GSoC 2018 - New Beginnings]]>I'm really excited to say that I'll be working with Ardupilot for the better part of the next two months! Although this is the second time I'm making a foray into Open Source Development, the project at hand this time is quite different from what I had worked on in

]]>
http://arnavdhamija.com/2018/06/05/gsoc-2018-new-beginnings/5c360e6fce88a669ed3a7a97Tue, 05 Jun 2018 13:00:00 GMT

I'm really excited to say that I'll be working with Ardupilot for the better part of the next two months! Although this is the second time I'm making a foray into Open Source Development, the project at hand this time is quite different from what I had worked on in my first GSoC project.

Ardupilot is open-source autopilot software for several types of semi-autonomous robotic vehicles, including multicopters, fixed-wing aircraft, and even marine vehicles such as boats and submarines. As the name suggests, Ardupilot was formerly based on the Arduino platform with the APM2.x flight controllers, which boasted an ATMega2560 processor. Since then, Ardupilot has moved on to officially supporting much more advanced boards with significantly better processors and more robust hardware stacks. That said, these flight controllers contain application-specific embedded hardware which is unsuitable for performing intensive tasks such as real-time object detection or video processing.

GSoC 2018 - New Beginnings

CC Setup with a Flight Computer

APSync is a recent Ardupilot project which aims to ameliorate the limited processing capability of the flight controllers by augmenting them with so-called companion computers (CCs). As of writing, APSync officially supports the Raspberry Pi 3B(+) and the NVidia Jetson line of embedded systems. One of the more popular use cases for APSync is to enable out-of-the-box live video streaming from a vehicle to a laptop. This works by using the CC's onboard WiFi chip as a WiFi hotspot to stream the video using GStreamer. However, the current implementation has some shortcomings which are:

  • Only one video output can be unicasted from the vehicle
  • The livestreamed video progressively deteriorates as the WiFi link between the laptop and the CC becomes weaker

This is where my GSoC project comes in: tackling the above issues to provide a good streaming experience from an Ardupilot vehicle. The former problem entails rewriting the video streaming code to allow for sending multiple video streams at the same time. The latter is quite a bit more interesting, as it involves solving several computer-networking and hardware-related engineering issues. "Solve" is a subjective term here, as there isn't any way to significantly boost the WiFi range from the CC's WiFi hotspot without some messy hardware modifications.

What can be done is to degrade the video quality as gracefully as possible. It's much better to have a smooth, low-quality video stream than a high-quality one laden with jitter and latency. At the same time, capping the stream at low quality even when the WiFi link and the CC's processor allow for better quality is inefficient. To "solve" this, we need some kind of dynamically adaptive streaming mechanism which can change the quality of the video streamed according to the strength of the WiFi connection.

My first thought was to use something along the lines of YouTube's DASH (Dynamic Adaptive Streaming over HTTP) protocol, which automatically scales the video quality according to the available bandwidth. However, DASH works in a fundamentally different way from what is required for adaptive livestreaming. DASH relies on having the same video pre-encoded in several different resolutions and bitrates. The client estimates the bandwidth of its connection to the server and requests one of the pre-encoded video chunks accordingly, typically choosing the chunk that can deliver the best possible quality without buffering.

YouTube's powerful servers have no trouble encoding a video several times, but this approach is far too intensive to be carried out on a rather anemic Raspberry Pi. Furthermore, DASH relies on QoS (short for Quality of Service, which covers parameters like bitrate, jitter, and packet loss) feedback from TCP ACK messages. This causes more issues, as we need to stream the video down using RTP over UDP instead of TCP. The main draw of UDP for livestreaming is that it performs better than TCP on low-bandwidth connections due to its smaller overhead. Unlike TCP, which places guarantees on message delivery through ACKs, UDP is purely best-effort and has no concept of ACKs at the transport layer. This means we need some kind of ACK mechanism at the application layer to measure the QoS.

Enter RTCP. This is the official sibling protocol to RTP which, among other things, reports packet loss, the cumulative sequence number received, and jitter. In other words - it's everything and the kitchen sink for QoS reports for multimedia over UDP! What's more, GStreamer natively integrates RTCP report handling. This is the approach I'll be using for getting estimated bandwidth reports from each receiver.

I'll be sharing my experiences with the H.264 video encoders and hardware in my next post.

]]>
<![CDATA[Be The Opposite]]>"The Opposite" (S05E21) was a watershed episode for Seinfeld. Not only did the series' direction change hands from Tom Cherones to Andy Ackerman after it, but the plots also became more focussed on the personality flaws in the characters rather than the bizarre situations they got themselves into.

]]>
http://arnavdhamija.com/2017/12/28/be-the-opposite/5c360e6fce88a669ed3a7a96Thu, 28 Dec 2017 04:59:00 GMT"The Opposite" (S05E21) was a watershed episode for Seinfeld. Not only did the series' direction change hands from Tom Cherones to Andy Ackerman after it, but the plots also became more focussed on the personality flaws in the characters rather than the bizarre situations they got themselves into. "The Opposite" was probably the only episode in which these two tropes converged, making for something quite unlike any other Seinfeld episode.

But that's not what I'm going to write about in this blog post. "The Opposite" has personal meaning for other reasons. As a short summary of the A-plot of this episode - George, a stout, balding, middle-aged man with no prospects with either a career or with the opposite sex, decides to act contrary to every instinct he's ever had. In a span of less than a day after doing so, he ends up with a new partner, a job with the New York Yankees, and finally gets the means to move out from his much beleaguered parents' house.

Though contrived for comical purposes, this episode holds something very profound under its veil of light-hearted comedy. The episode goes to show that even when it's painfully obvious that you're doing something wrong, it can be impossible to break out of the self-induced loop of listening to your instincts.

Jerry also drives this home with an understatedly profound remark in this episode, "If every instinct you have is wrong...then the opposite would have to be right!".

Perhaps it's a primal urge to follow your instincts; after all, your instincts are a product of your past experiences, which have kept you alive. On the other hand, I feel a lot of it also has to do with the comfort zone you have to leave to challenge your own perceptions. When you've been doing something wrong unconsciously, it becomes much harder to accept later on that you were doing it wrong. Your brain accustoms itself to behaving in a certain way to a stimulus and, left unchecked, this becomes a part of your personality. This is why I think shifting out of self-harmful behaviour is so hard.

I've been a victim of falling into the trap of my own instincts ever since I joined college. Through a lot of time spent brooding and many moments of soul-searching in my daily diary entries, I was exactly aware of what I was unhappy with in life and what I had to do to fix it. Most of my problems could be traced to a lack of self-esteem, but I was more willing to wallow in pity than to do anything about it. It was only a few months ago that I decided to improve things by increasing physical activity and social exposure. There were many times I felt a familiar nagging of doubt, but this too subsided as soon as I saw things can actually get better by not listening to yourself.

There is a poignant quote I've saved from Anne Lamott's Bird by Bird about not listening to the radio station in your head - the radio station KFKD, or K-Fucked:

"If you are not careful, station KFKD will play in your head twenty-four hours a day, nonstop, in stereo. Out of the right speaker in your inner ear will come the endless stream of self-aggrandizement, the recitation of one’s specialness, of how much more open and gifted and brilliant and knowing and misunderstood and humble one is. Out of the left speaker will be the rap songs of self-loathing, the lists of all the things one doesn’t do well, of all the mistakes one has made today and over an entire lifetime, the doubt, the assertion that everything that one touches turns to shit, that one doesn’t do relationships well, that one is in every way a fraud, incapable of selfless love, that one has no talent or insight, and on and on and on. You might as well have heavy-metal music piped in through headphones while you’re trying to get your work done."

Maybe the best way out lies in our mistakes, or rather, the opposite of them.


]]>
<![CDATA[ThinkPad 13: Thoughts after 18+ months]]>It's been over a year since I switched laptops and I want to share my thoughts on my current one - the Lenovo ThinkPad 13 Gen 1. Before getting this laptop, I had been using a Lenovo IdeaPad G580 which I had gotten back in 2013. While that laptop suited

]]>
http://arnavdhamija.com/2017/12/10/thinkpad-13-thoughts-after-18-months/5c360e6fce88a669ed3a7a95Sun, 10 Dec 2017 04:59:00 GMTIt's been over a year since I switched laptops and I want to share my thoughts on my current one - the Lenovo ThinkPad 13 Gen 1. Before getting this laptop, I had been using a Lenovo IdeaPad G580 which I had gotten back in 2013. While that laptop suited my needs at the time, the unwieldy 15.6" display, sub-standard two-hour battery life, and 2.8kg weight just weren't cutting it for college. I decided to replace it with a smaller laptop with good battery life. I was pretty satisfied with the performance of the IdeaPad, so anything as good as or better than its i5-3230M was enough for me.

The first laptop which I looked at was the Asus UX305UA which ticked most of my boxes. However, some reviews had stated that its build quality was questionable so I decided to hold out for a better laptop. I later stumbled upon the Lenovo ThinkPad T460(s) which looked perfect for my needs but was just a touch out of my budget. While I was considering this, news began to surface about a 13.3" ThinkPad releasing soon, later to be known as the ThinkPad 13.

It wasn't an easy decision between the T460s and the 13. The T460s was definitely a more charming laptop, but there were just too many drawbacks for it to be a serious contender. I found several negative reviews regarding the battery life and the 1080p display lottery. The ThinkPad 13 wasn't without issues of its own, but there wasn't a single review that didn't sing praises of its keyboard, battery life, and overall value proposition. Plus, it came with a USB Type-C port and dual upgradeable RAM slots, and was at least $200 cheaper than a comparably specced T460s.

I got the silver version of the ThinkPad 13. Die-hard ThinkPad fans may condemn a non-black ThinkPad, but I think it looks great. Plus, the silver version came with the free 1080p IPS display upgrade. It also has a better choice of materials with an aluminium lid in place of the ABS plastic lid found on the black version of the ThinkPad 13.

I configured my laptop with the following specs for $720 (₹46500):

  • Intel Core i5-6300U Processor (3MB Cache, up to 3.00GHz)
  • Windows 10 Home 64 English
  • 8GB DDR4-2133 SODIMM
  • Intel HD Graphics 520
  • KYB SR ENG
  • 3cell 42Wh
  • UltraNav (TrackPoint and ClickPad) without Fingerprint Reader
  • Software TPM Enabled
  • 720p HD Camera
  • 256 GB Solid State Drive, SATA3 OPAL2.0 - Capable
  • 45W AC Adapter - US(2pin)
  • Intel Dual Band Wireless-AC(2x2) 8260, Bluetooth Version 4.1 vPro
  • 13.3" FHD (1920 x 1080) IPS Anti-Glare, 220 nits

I don't think the Windows 10 version is available in India presently. I believe a watered down Chromebook version is available on Amazon.

Since I was saving a fair bit of money by not buying the T460s, I maxed out the CPU with the i5-6300U though I don't ever see myself using vPro. The other upgrades were the RAM and SSD to 8GB and 256GB respectively.

The ThinkPad 13 Gen 2 and Gen 1 share the same chassis, so most of the parts of this review regarding build quality, I/O, and keyboard apply for the Gen 2 as well.

Build Quality

Pretty decent. At 1.4kg it's not the smallest or lightest 13.3" laptop available, but it's small enough to not matter. Given the number of I/O ports onboard, it's a very acceptable compromise.

You can immediately tell it isn't made from the fancy composites of the X1 Carbon or the magnesium alloy of the T/X/P series ThinkPads. Yet the plastic frame of the laptop is good quality and can easily stand out on its own. Compared to other ThinkPads I've tried, it feels considerably better built than the E470 and the X1 Yoga and is on par, if not better than the T440p. The P50 is more solid, though it must be kept in mind that the P50 is a much thicker laptop than the 13. The 13's build quality is on another level compared to the Lenovo IdeaPad G580 I had been using before this. Even after a year of heavy use, creaking on twisting the frame is non-existent and nothing feels like it's on the verge of falling apart like it did with the IdeaPad.

The hinges are nice and tight and the lid gives a satisfying thump on closing. Speaking of which, the aluminium lid is solid and protects the display nicely. However, it does give the laptop a heterogeneous look and feel with the rest of the frame being made of plastic.

One aside here, the 13 does have unique touches not found on higher end models. For example, the "i's" in the ThinkPad logo in both the lid and the palm-rest glow and the ThinkPad logo has a really nice brushed finish to it on the lid.

Opening the laptop for servicing and upgrading is not a good experience. The bottom base is held by 10 captive screws and is incredibly fiddly to pry open. I ended up scratching off some of the paint of the bottom cover in the two times I tried opening the base. That said, upgradability is much better than what you can find on comparable ultrabooks. Both DDR4 RAM slots, the M.2 SATA SSD (no NVMe on the Gen 1 unfortunately), the Wi-Fi card, and battery can be easily replaced.

The touchpad has its share of mechanical issues too. It is a clickpad design, similar to that found on MacBooks. Sometimes dust and debris would find its way in the gap between the base of the laptop and the touchpad causing the right button to physically stop clicking. It's a simple problem to fix and all it needs is a swab with a piece of thick paper under the touchpad. Nevertheless, I expected a more robust design. At least this problem isn't exclusive to the 13 as it seems to happen to the T450s as well.

Lastly, the plastic construction does have some drawbacks when it comes to scratches. A small section of the lid joining it to the hinges is plastic and is much more scratched up than the aluminium part. After a year, the base unit also had some paint rubbing off at the edges. I would suggest buying a carrying sleeve to avoid this. The silver palmrests have also taken on a slightly darker shade than the rest of the laptop.

Fortunately, all the damage is only cosmetic and the laptop has held up pretty well after being jostled around campus for more than a year.

I/O

Connectivity is a strong point for the ThinkPad 13. It's impressive to see how many ports there are on a laptop of this size! You get 3 USB-A ports running at USB 3.0 speeds, a full size HDMI port, an SD card reader in which the card goes all the way in, a headphone jack, and a USB-C port supporting charging and 2160p/60fps video out. There's also Lenovo's proprietary OneLink+ docking solution though I haven't tried this out yet.

On the subject of the USB-C port, a minor nitpick is that it's USB 3.1 Gen1 and doesn't support Thunderbolt. I still much prefer it to the option of having a full size Ethernet port as on the T460s. The USB-C port is so much more flexible.

I also discovered that all the USB ports support fast (faster?) charging. My phone charges as fast from the laptop's USB port as it does from a 5V/1.5A charger.

Wireless connectivity is serviced by the Intel 8260 Wi-Fi card, the same one used in all the top-spec 2016 ThinkPads. Speed and connection quality are good, and it supports 802.11ac.

Display

You can't really go wrong with a 1080p IPS display on a 13.3" laptop.

The LG 1080p IPS display on ThinkPad 13 comes with a matte coating. Even for an IPS display, the viewing angles are excellent. 13.3" and 1080p is a sweet spot for sharpness and UIs scale fine in both KDE Neon and Windows 10. Thanks to the matte coating, the screen brightness is perfectly sufficient 90% of the time when I use it. I've hardly run into any instances where I felt the need to crank the screen brightness any higher.

Colour space coverage is reportedly below average for an IPS screen and it really shows when put side by side with a MacBook Pro or a Dell Inspiron 7460. That said, for coding and web-browsing, it's hard to ask for more.

Battery Life

The ThinkPad 13 is powered by a 42 Wh battery. It has degraded slightly from when I got the laptop - initially the battery was able to hold 46 Wh of charge, but now it usually holds 40-41.5 Wh when full. I usually have several Firefox tabs open along with IDEs like VS Code, Atom, or Android Studio on KDE Neon. The laptop is very energy efficient and power draw rarely climbs above 6.5W at 30% brightness after enabling TLP. My old laptop used to idle at 17W, and when people claim that Intel has gimped their CPUs since the Ivy Bridge era, I think they're disregarding the huge leaps in power efficiency.
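On most ThinkPads, the instantaneous power draw can be read straight from sysfs, which is how I'd sanity-check figures like these (the path is an assumption — some batteries expose current_now/voltage_now instead of power_now):

```shell
# Battery discharge rate in microwatts; divide by 1,000,000 for watts
cat /sys/class/power_supply/BAT0/power_now
```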

In more than a year of owning it, I haven't once drained the battery. I've even gotten through 9-5 days at a summer internship solely off battery power. I typically get around 6-8 hours of battery life depending on the usage. I can't speak of how good or bad battery life is on Windows 10 as I've never used it for anything other than gaming, which brings me to my next section.

Performance

Subjectively, it's quite snappy. A lot of the performance improvement compared to my previous laptop can be chalked up to the SSD. The one in the ThinkPad 13 I got is the LiteOn L8H-256V2G, and it gives about 540MB/sec in sequential reads and 300MB/sec in sequential writes. It's common knowledge how much faster SSDs are than HDDs, but I hadn't expected the difference in swapping performance to be as enormous as it is. I can hardly tell when Linux starts filling up the swap space on the SSD, as the degradation in performance is minimal.
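A quick way to reproduce the sequential-read figure is hdparm (hedged: buffered reads will differ somewhat from benchmark suites, and your SSD may not be /dev/sda):

```shell
# Timed buffered sequential reads from the SSD
sudo hdparm -t /dev/sda
```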

The i5-6300U is sufficient for my needs, and thermals in the ThinkPad 13 are good enough that the laptop doesn't throttle even under prolonged loads. The processor cools passively most of the time and the fan only kicks in under load. Even so, I'm hard pressed to hear the fan below 4000RPM. The CPU can boost to 2.9 GHz on both cores, but I've rarely seen it climb over 2.8 GHz. I'm not sure if this is due to the 15W TDP of the SoC or the laptop's conservative cooling policy, as core temperatures never climb over 75°C in Prime95.
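To watch the boost clocks and core temperatures under load yourself (this assumes the lm-sensors package is installed and configured):

```shell
# Current per-core clock speeds
grep "cpu MHz" /proc/cpuinfo

# Core temperatures as reported by lm-sensors
sensors
```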

As far as programming goes, it's not struggled with anything I've thrown at it - from building large C++ codebases to running instances of IntelliJ and Android Studio (with the x86 Android emulator) together. The CPU is definitely faster than the i5-3230M in my old laptop.

Plus, for the first time, gaming isn't a terrible experience on integrated graphics if your expectations are low enough. I can get 30fps in Skyrim Special Edition at 720p and Medium graphics presets with 2X AA. DiRT3 and Mirror's Edge gave about 60fps at 720p with low settings. Sonic Mania works perfectly at 1080p 60fps. On the other hand, it struggled with returning a framerate greater than 30fps on Sonic Generations. Bear in mind that all this is with single channel 8GB RAM. Performance could be better with dual channel RAM.

Load times in all games are much shorter than they would be on a gaming console such as the PS4, thanks to the SSD.

Overall, as long as you don't mind running old games at 720p with some sacrifice in visual quality, the integrated Intel HD 520 should be adequate. At least it's good at smooth 4K video playback.

Keyboard

Really good! For years, I used to think that the AccuType keyboard on my Lenovo G580 was unbeatable, but that changed as soon as I laid my hands on the ThinkPad 13's keyboard. The keys are perfectly spaced, have amazingly deep travel, and have great texture. I really like the layout too - Ctrl + PgUp/PgDn makes it very easy to switch tabs, and Fn and the left Ctrl can be swapped in the BIOS. I don't miss keyboard backlighting, but I would really appreciate something like the ThinkLight found on older ThinkPads.

It turns out that the keyboard on the 13 is exactly the same as the one on the T460s. I learned this by ruining the stock keyboard when doing my weekly round of sterilising all my electronic devices. AliExpress didn't have the silver 13 keyboard but the 13's FRU manual showed that the T460s keyboard (albeit black instead of silver) could be directly dropped in. After a month of waiting to get it shipped from China, it was a simple 10 minute job to replace the keyboard. Props to Lenovo for reusing FRUs and keeping the 13 so serviceable.

I'm pretty happy with the touchpad as well. It has a smooth surface and has a very satisfying physical click to it. Gestures work well and it's supported out of the box on Linux. The TrackPoint and click buttons have become indispensable for programming. However, I'm disappointed about how fast the TrackPoint's coating wears off as it becomes really difficult to use the shallow TrackPoint once this happens. ThinkPads really should ship with some replacement caps.

Linux Support

I've been using an Ubuntu derivative distro on my laptop ever since I got it. In the first few months, I used Kubuntu 16.04, and later I switched to KDE Neon (which, incidentally, is based on Ubuntu 16.04). Linux support is remarkable and everything "just works" right out of the box. Older versions of the 4.x Linux kernel had some issues with the touchpad "desyncing" every 15 minutes, but this has by and large been fixed over the last few months. Thanks to the excellent Mesa drivers, 3D performance of the Intel HD 520 is very stable and actually works better than my friend's Nvidia Quadro GPU did with the Nouveau drivers (not in terms of frame rate, but stability). The TrackPoint works well and all the function keys work perfectly. Also, as mentioned earlier, battery life under Linux is great. I'm curious to see if the OneLink+ dock works and if the USB-C port can output video on Linux.

In my opinion, there's nothing you're missing by running Linux on the 13 instead of Windows when it comes to hardware support.

Conclusion

The ThinkPad 13 is a nice laptop, but it's not for everyone. It fits my workflow perfectly with its good keyboard, display, and build quality. However, not everyone needs as many I/O ports as the 13 has, and many people would be better served by the numerous ultrabooks with better displays and bigger batteries. Others might spring for a higher-end ThinkPad model like the T460s/T470s with better build quality and connectivity options. That said, the ThinkPad 13 does a lot of things right and fills a niche in the market for small laptops with good keyboards and plenty of ports. The compromises made in the ThinkPad 13 are reasonable, and if you can live with them, it is definitely worth considering for your next laptop.

]]>
<![CDATA[Akademy 2017]]>Last month, I attended KDE's annual conference, Akademy. This year, it was held in Almeria, a small Andalusian city on the south-east coast of Spain.

The name of the conference is no misspelling, it's part of KDE's age old tradition of naming everything to do with KDE with a 'k'.

]]>
http://arnavdhamija.com/2017/08/17/akademy-2017/5c360e6fce88a669ed3a7a94Thu, 17 Aug 2017 14:30:00 GMT

Last month, I attended KDE's annual conference, Akademy. This year, it was held in Almeria, a small Andalusian city on the south-east coast of Spain.

The name of the conference is no misspelling, it's part of KDE's age old tradition of naming everything to do with KDE with a 'k'.

Akademy was a collection of amazing, life-changing experiences. It was my first solo trip abroad and it taught me so much about travel, KDE, and getting around a city halfway across the world from home.

Travel

My trip began from the recently renamed Kempegowda International Airport, Bangalore. Though small for an international airport, its size works to its advantage as it is very easy to get around. Check-in and immigration were a breeze and I had a couple of hours to check out the loyalty card lounge, where I sipped soda water thinking about what the next seven days had in store.

The first leg of my trip was on an Etihad A320 to Abu Dhabi, a four hour flight departing at 2210 on 20 July. The A320 isn't particularly unique equipment, but then again, it was a rather short leg. The crew onboard that flight seemed to be a mix of Asian and European staff.

Economy class in Etihad was much better than any other Economy class product I'd seen before. Ample legroom, very clean and comfortable seats, and an excellent IFE system. I was content looking at the HUD-style map visualisation which showed the AoA, vertical speed, and airspeed of the airplane.

On the way, I scribbled a quick diary entry and started reading Part 1 of Sanderson's Stormlight Archive - 'The Way of Kings'.

Descent to Abu Dhabi airport started about 3:30 into the flight. Amber city lights illuminated the desert night sky. Even though it was past midnight, the plane's IFE reported an outside temperature of 35C. Disembarking from the plane, the muggy atmosphere hit me after spending four hours in the plane's air-conditioned cabin.

The airport was dominated by Etihad aircraft - mainly Airbus A330s, Boeing 787-8s, and Boeing 777-300ERs. There were also a number of other airlines part of the Etihad Airways Partners Alliance such as Alitalia, Jet Airways, and some Air Berlin equipment. As it was a relatively tight connection, I didn't stop to admire the birds for too long. I traversed a long terminal to reach the boarding gate for the connecting flight to Madrid.

The flight to Madrid was another Etihad operated flight, only this time on the A320's larger brethren - the A330-200. This plane was markedly older than the A320 I had been on the first leg of the trip. Fortunately, I got the port side window seat in a 2-5-2 configuration. The plane had a long take-off roll and took off a few minutes after 2am. Once we reached cruising altitude, I opened the window shade. The clear night sky was full of stars and I must have spent at least five minutes with my face glued to the window.

I tried to sleep, preparing myself for the long day ahead. Soon after I woke up, the plane landed at Madrid Barajas Airport and taxied for nearly half an hour to reach the terminal. After clearing immigration, I picked up my suitcase and waited for a bus which would take me to my next stop - the Madrid Atocha Railway Station. Located in the heart of the city, Atocha is one of Madrid's largest train stations and connects it to the rest of Spain. My train to Almeria was later that day - at 3:15 in the afternoon.

On reaching Atocha, I got my first good look at Madrid.

Akademy 2017

My facial expression was quite similar

Akademy 2017

I was struck by how orderly everything was, starting with the traffic. Cars gave the right of way to pedestrians. People walked on zebra crossings and cyclists stuck to the defined cycling lanes. Things taken for granted in developed countries looked like a world apart from Bangalore. Shining examples of Baroque and Gothic architecture were scattered among newer establishments.

Having a few hours to kill before I had to catch my train, I roamed around Buen Retiro Park, one of Spain's largest public parks. It was a beautiful day, bright and sunny with the warmth balanced out by a light breeze.

My heavy suitcase compelled me to walk slowly, which made me take in as much as I could. Retiro Park is a popular stomping ground for joggers, skaters, and cyclists alike. Despite it being 11am on a weekday, I saw plenty of people jogging through the park. After this, I trudged through some quaint neighbourhoods with cobbled roads and old apartment buildings dotted with small shops on the ground floor.

Maybe it was the sleep-deprivation or dehydration after a long flight, but everything felt so surreal! I had to pinch myself a few times - to believe that I had come thousands of miles from home and was actually travelling on my own in a foreign land.

I returned to Atocha and waited for my train. By this time, I had come to realise that language was going to be a problem for this trip as very few people spoke English and my Spanish was limited to a few basic phrases - notably 'No hablo Español' and 'Buenos Dias' :P Nevertheless, the kind folks at the station helped me find the train platform.

Trains in Spain are operated by state-owned train companies. In my case, I would be travelling on a Renfe train going to Almeria. The coaches are arranged in a 2-2 seating configuration, quite similar to those in airplanes, albeit with more legroom and charging ports. The speed of these trains is comparable to fast trains in India, with a top speed of about 160km/h. The 700km journey was scheduled to take about 7 hours. There was plenty of scenery on the way with sloping mountain ranges and deserted valleys.

Akademy 2017

Big windows encouraged sightseeing

After seven hours, I reached the Almeria railway station at 10pm. According to Google Maps, Residencia Civitas, the hostel which KDE had booked for all the attendees, was only 800m away - well within walking distance. However, unbeknownst to me, I started walking in the opposite direction (my phone doesn't have a compass!). This kept increasing the Google Maps ETA, and only when I was 2km off track did I realise something was very wrong. Fortunately, I managed to get a taxi to take me to Residencia Civitas - a university hostel where all the Akademy attendees would be staying for the week.

After checking in to Civitas, I made my way to the double room. Judging from the baggage and the shoes in the corner, someone had moved in here before I did. About half an hour later, I found out who - Rahul Yadav, a fourth year student at DTU, Delhi. Exhausted after an eventful day of travel, I switched off the lights and went to sleep.

The Conference

The next day, I got to see the other Akademy attendees over breakfast at Civitas. In all, there were about 60-70 attendees, which I was told was slightly smaller than previous years.

The conference was held at University of Almería, located a few kilometres from the hostel. KDE had hired a public bus for transport to and from the hostel for all the days of the conference. The University was a stone's throw from the Andalusian coastline. After being seated in one of the larger lecture halls, Akademy 2017 was underway.

Akademy 2017

Konqi! And KDE Neon!

The keynote talk was by Robert Kayne of Metabrainz, about the story of how MusicBrainz was formed out of the ashes of CDDB. The talk set the tone for the first day of Akademy.

The coffee break after the first two talks was much needed. I was grappling with sleep deprivation and jet lag from the last two days and needed all the caffeine and sugar I could get to keep myself going for the rest of the day. Over coffee, I caught up with some KDE developers I met at QtCon.

Throughout the day, there were a lot of good talks, notably 'A bit on functional programming', and five quick lightning talks on a variety of topics. Soon after this, it was time for my very own talk - 'An Introduction to the KIO Library'.

The audience for my talk consisted of developers with several times my experience. Much to my delight, the maintainer of the KIO Library, David Faure was in the audience as well!

Here's where I learned another thing about giving presentations - they never go quite as well as they seem to when rehearsed alone. I ended up speaking faster than I had planned to, leaving more than enough time for a Q&A round. Just as I had feared, I was asked some questions about the low-level implementation of KIO, which thankfully David fielded for me. I was perspiring after the presentation, and it wasn't the temperature which was the problem 😅 A thumbs up from David afterwards gave me some confidence that I had done alright.

Following this, I enjoyed David Edmundson's talk about binding properties in QML. The next presentation I attended is where things ended up getting heated. Paul Brown went into detail about everything wrong with Kirigami's TechBase page. This drew some, for lack of a better word, passionate people to retaliate. Though it was only supposed to be a 10 minute lightning talk, the debate raged on for half an hour between the two schools of thought on how TechBase documentation should be written. The only thing which brought the discussion to an end was the bus for returning to Civitas leaving at 8pm sharp.

Still having a bit of energy left after the conference, I was ready to explore this Andalusian city. One thing which worked out nicely on this trip is the late sunset in Spain at this time of the year. It is as bright as day even at around 9pm and the light only starts waning at around 9:30pm. This gave Rahul and me plenty of time to head to the beach, which was about a 20 minute walk from the hostel.

Here, it struck me how much I loved the way of life here.

Unlike Madrid, Almeria is not known as a tourist destination, so most of the people living there were locals. In a span of about half an hour, I watched how an evening unfolds in this city. As the sun started dipping below the horizon, families with kids, college couples, and high-school friends trickled from the beach to the boardwalk for dinner. A typical evening looked delightfully simple and laid-back in this peaceful city.

The boardwalk had plenty of variety on offer - from seafood to Italian cuisine. One place caught my eye, a small cafe serving Doner Kebab called 'Taj Mahal'. After a couple of days of eating nothing but bland sandwiches, Rahul and I were game for anything with a hint of spice in it. As I had done in Madrid, I tried ordering Doner Kebab using a mixture of broken Spanish and improvised sign language, only to receive a reply from the owner in Hindi! It turned out that the owner of the restaurant was Pakistani and had migrated to Spain seven years ago. Rahul made a point to ask for more chilli - and the Doner kebabs we got were not lacking in spice. I had more chilli in that one kebab than I normally would have in a week. At least it was a change from Spanish food, which I wasn't all that fond of.

Akademy 2017

View from the boardwalk

Akademy 2017

The next day was similar to the first, only a lot more fun. I spent a lot of time interacting with the people from KDE India. I also got to know my GSoC mentor, Boudhayan Gupta (@BaloneyGeek). The talks on this day were as good as the previous day's and I got to learn about Neon Docker images, the story behind KDE's Slimbook laptop, and things to look forward to in C++17/20.

The talks were wrapped up with the Akademy Awards 2017.

Akademy 2017

David Faure and Kevin Ottens

There were still 3 hours of sunlight left after the conference and, not being ones to waste it, we headed straight for the beach. Boudhayan and I made a treacherous excursion out to a rocky pier covered with moss and glistening with seawater. My well-worn sandals were the only thing keeping me from slipping onto a bunch of sharply angled stones jutting out from the waves. Against my better judgement, I managed to reach the end of the pier, only to see a couple of crabs take an interest in us. With the tide rising and the sun falling, we couldn't afford to stay much longer, so we headed back to the beach the way we came. Not long after, I couldn't help myself and I headed into the water with an enthusiasm I had only known as a child. Probably more so for BaloneyGeek though, who went in headfirst with his three-week-old Moto G5+ in his pocket (Spoiler: the phone was irrevocably damaged by half a minute of immersion in saltwater). In the midst of this, we found a bunch of KDE folks hanging out on the beach with several boxes of pizza and bottles of beer. Free food!

Exhausted but exhilarated, we headed back to Civitas to end another very memorable day in Almeria.

Akademy 2017

Estacion Intermodal, a hub for public transport in Almeria

With the talks completed, Akademy 2017 moved on to its second leg, which consisted more of BoFs (Birds of a Feather) and workshops.

The QML workshop organised by Anu was timely, as my relationship with QML has been hot and cold. I would always go in circles with the QML Getting Started tutorials as there aren't as many examples of how to use QtQuick 2.x as there are for, say, Qt Widgets. I understood how to integrate JavaScript with the QML GUI and I will probably get around to making a project with the toolkit when I get the time. Paul Brown held a BoF about writing good user documentation and deconstructed some of the more pretentious descriptions of software, with suggestions on how to avoid falling into the same pitfalls. I sat in on a few more BoFs after this, but most of it went over my head as I wasn't contributing to the projects discussed there.

Feeling a bit weary of the beach, Rahul and I decided to explore the inner parts of the city instead. We planned to go to the Alcazaba of Almeria, a thousand-year-old fortress in the city. On the way, we found a small froyo shop and ordered a scoop with chocolate sauce and lemon sauce. Best couple of euros spent ever! I loved the tart flavour of the froyo and how it complemented the toppings with its texture.

This gastronomic digression aside, we scaled a part of the fort only to find it locked off with a massive iron door. I got the impression that the fort was rarely ever open to begin with. With darkness fast approaching, we found ourselves in a dodgy neighbourhood and we tried to get out as fast as we could without drawing too much attention to ourselves. This brought an end to my fourth night in Almeria.

Akademy 2017

View from Alcazaba

The BoFs continued throughout the 25th, the last full day of Akademy 2017. I participated in the GSoC BoF where KDE's plans for future GSoCs, SoKs, and GCIs were discussed (isn't that a lot of acronyms!). Finally, this was one topic where I could contribute to the discussion. If there was any takeaway from the discussion for GSoC aspirants, it is to start as early as you can!

I sat in on some other BoFs as well, but most of the topics discussed were outside my scope. The Mycroft and VDG BoF did have some interesting exchanges of ideas for future projects that I might consider working on if I get free time in the future.

Rahul was out in the city that day, so I had the evening to explore Almeria all by myself.

I fired up Google Maps to see anything of interest nearby. To the west of the hostel was a canal I hadn't seen previously so I thought it would be worth a trip. Unfortunately, because of my poor navigation skills and phone's lack of compass, I ended up circling a flyover for a while before ditching the plan. I decided to go to the beach as a reference point and explore from there.

What was supposed to be a reference point ended up becoming the destination. There was still plenty of sunlight and the water wasn't too cold. I put one toe in the water, and then a foot.

And then, I ran.

Running barefoot along the coastline was one of the best memories I have of the trip. For once, I didn't think twice about what I was doing. It was pure liberation. I didn't feel the exertion or the pebbles pounding my feet.

Akademy 2017

Almeria's Beaches

The end of the coastline had a small fort and a dirt trail which I would've very much wanted to cycle on. After watching the sun sink into the sea, I ran till the other end of the boardwalk to find an Italian restaurant with vegetarian spinach spaghetti. Served with a piece of freshly baked bread, dinner was delicious and capped off yet another amazing day in Almeria.

Akademy 2017

Dinner time!

Day Trip

The 26th was the final day of the conference. Unlike the other days, the conference was only for half a day with the rest of the day kept aside for a day trip. I cannot comment on how the conference went on this day as I had to rush back to the hostel to retrieve my passport, which was necessary to attend the day trip.

Right around 2 in the afternoon we boarded the bus for the day trip. Our first stop was the Plataforma Solar de Almería, a solar energy research plant in Almeria. It houses some large heliostats for focussing sunlight at a point on a tower. This can be used for heating water and producing electricity.

There was another facility used for testing the tolerance of spacecraft heat shields by subjecting them to temperatures in excess of 2000C by focussing sunlight.

Akademy 2017

Heliostats focus at the top of the tower

The next stop was at the San José village. Though not too far from Almeria, the village is frequented by tourists much more than Almeria is and has a very different vibe. The village is known for its beaches, pristine clear waters, and white buildings. I was told that the village was used in the shooting of some films such as The Good, the Bad and the Ugly.

Akademy 2017

Our final stop for the day was at the Rodalquilar Gold Mine. Lost to time, the mine had been shut down in 1966 due to the environmental hazards of the cyanide process used to extract gold. The mine wouldn't have looked out of place in a video game or an action movie, and indeed, it was used in the filming of Indiana Jones and the Last Crusade. There was a short trek from the base of the mine to a trail which wrapped around a hill. After descending from the top, we headed back to the hostel.

Akademy 2017

This concluded my stay in Almeria.

Madrid

After checking out of the hostel early the next morning, I caught a train to Madrid. I had a day in the city before my flight to Bangalore the next day.

I reached Atocha at about 2 in the afternoon and checked in to a hotel. I spent the entire evening exploring Madrid on foot and an electric bicycle through the BiciMAD rental service.

Photo Dump

Akademy 2017

Akademy 2017

Akademy 2017

Akademy 2017

Akademy 2017

Akademy 2017

Akademy 2017

Akademy 2017

Akademy 2017

Akademy 2017

Akademy 2017

Akademy 2017

The Return

My flight back home was on the following morning, 28 July. The first leg of the return was, yet again, on an Etihad aircraft bound for Abu Dhabi. This time it was an A330-300. It was an emotional eight-hour flight - with the memories of Akademy still fresh in my mind. To top it off, I finished EarthBound (excellent game!) during the flight.

Descent into Abu Dhabi started about half an hour before landing. This time though, I got to see the bizarre Terminal 1 dome of the Abu Dhabi airport. The Middle East has always been a mystical place for me. The prices of food and drink in the terminal were hard to stomach - 500mL of water was an outrageous 8 UAE Dirhams (₹140)! Thankfully it wasn't a very long layover, so I didn't have to spend too much.

Akademy 2017

Note to self: may cause tripping if stared at for too long

The next leg was a direct flight to Bangalore, on another Etihad A330. Compared to all the travel I had done in the last two days, the four hour flight almost felt too short. I managed to finish 'Your Lie in April' on this leg.

I had mixed emotions on landing in Bangalore - I was glad to have reached home, but a bit sad that I had to return to college in only two days.

Akademy 2017 was an amazing trip and I am very grateful to KDE for letting me present at Akademy and for giving me the means of getting there. I hope I can make more trips like these in the future!

Subscribe to my KDE RSS feed here!

]]>
<![CDATA[On College Students]]>As a school kid, I was always fascinated by meeting college students from top tier colleges. There was something surreal in the way they talked and they confidently stood with T-shirts proudly proclaiming the name of their alma mater. I used to have the impression that no matter what came

]]>
http://arnavdhamija.com/2017/07/20/on-college-students/5c360e6fce88a669ed3a7a93Thu, 20 Jul 2017 15:00:00 GMTAs a school kid, I was always fascinated by meeting college students from top tier colleges. There was something surreal in the way they talked and how they confidently stood with T-shirts proudly proclaiming the name of their alma mater. I used to have the impression that no matter what came in the future, there was a certain justification of their abilities and intelligence. Perhaps society had conditioned me to think as such. Maybe their apparent confidence was just a projection of how I thought it would feel to be in a top tier college.

It struck me just recently that everyone in my college is of the same surreal breed I used to look up to. But now, the magic is lost on me. It's no longer the case that simply being admitted to a Tier-1 college gives one opportunities otherwise impossible. At the same time, the name of the college on your degree matters little if you haven't done enough legwork in college to justify being called an "Engineer".

]]>
<![CDATA[KIO Stash - Shipped!]]>It's been almost a year since I finished my GSoC project for implementing discontinuous file selections as a KIOSlave.

The ioslave is now officially shipped in the KDE ExtraGear Utils software package and it can be downloaded from here: https://download.kde.org/stable/kio-stash/kio-stash-1.0.tar.xz.mirrorlist

]]>
http://arnavdhamija.com/2017/07/04/kio-stash-shipped/5c360e6fce88a669ed3a7a92Tue, 04 Jul 2017 06:00:00 GMT

It's been almost a year since I finished my GSoC project for implementing discontinuous file selections as a KIOSlave.

The ioslave is now officially shipped in the KDE ExtraGear Utils software package and it can be downloaded from here: https://download.kde.org/stable/kio-stash/kio-stash-1.0.tar.xz.mirrorlist

Introduction

Selecting multiple files in any file manager for copying and pasting has never been a pleasant experience, especially if the files are in a non-continuous order. Often, when selecting files using Ctrl+A or the selection tool, we find that we need only a subset of the files we have selected. This leads to the unwieldy operation of removing files from our selection. Of course, the common workaround is to create a new folder and put all the items in this folder prior to copying, but this is a very inefficient and slow process if large files need to be copied. Moreover, Ctrl+Click requires fine motor skills to avoid losing the entire selection of files.

This is an original project with a novel solution to this problem. My solution is to add a virtual folder to all KIO applications, where links to files and folders can be temporarily saved for a session. The files and folders are "staged" in this virtual folder. Files can be added to it using all the regular file management operations such as Move, Copy, and Paste, or by drag and drop. Hence, complex file operations such as moving files across many devices can be made easy by staging the operation before performing it.

Project Overview

This project consists of the following modules. As there is no existing implementation in KIO for managing virtual directories, all the following modules were written completely from scratch.

KIO Slave

The KIO slave is the backbone of the project. It is responsible for interfacing with the GUI of a KDE application and provides the methods for various operations such as copying, deleting, and renaming files. All operations on the KIO slave are applied to a virtual stash filesystem (explained below). These operations are applied through inter-process communication using Qt's D-Bus API.

The advantage of the KIO slave is that it provides a consistent experience throughout the entire KDE suite of applications. Hence, this feature would work with all KIO compatible applications.

Stash File System

The Stash File System (SFS) is used for virtually staging all the files and directories added to the ioslave. When a file is copied to the SFS, a new file node is created for it under the folder to which it is copied. On copying a folder, a new directory node is created on the SFS, with all the files and directories under it copied recursively as dictated by KIO. The SFS is a very important feature of the project as it allows the user to create folders and move items on the stash ioslave without touching the physical file system at all. Once a selection is curated on the ioslave, it can be seamlessly copied to the physical filesystem.

The SFS is implemented as a QHash with the node's URL (its location on the SFS) as the key and a StashNodeData object as the value, which holds all the properties of a given node in the SFS (such as the file name, the source path, and, for directories, the children files).

Memory use of the SFS is nominal on a per file basis - each file staged on the SFS requires roughly 300 bytes of memory.

Stash Daemon

The Stash File System runs in the KDE Daemon (kded5) container process. An object of the SFS is created on startup when the daemon is initialized. The daemon responds to calls from the ioslave communicated over the session bus and creates and removes nodes in the SFS.

Installation

Make sure you have KF5 backports with all the KDE dependency libraries installed before you build!

Download the tarball to your favorite folder, extract, and run:

mkdir build
cd build
cmake -DCMAKE_INSTALL_PREFIX=/usr -DKDE_INSTALL_USE_QT_SYS_PATHS=TRUE ..
make
sudo make install
kdeinit5

Result

Open Dolphin and set the path to stash:/. This directory is completely 'virtual' and anything added to it will not consume any extra disk space. All basic file operations such as copy, paste, and move should work.

Folders can be created, curated, and renamed on this virtual folder itself. However, as this is a virtual directory, files cannot be created on it.

Copying from remote locations such as mtp:/ may not work, however. Please report any bugs for the same on http://bugs.kde.org/.

]]>
<![CDATA[Quadcopters - A Hitchhiker's Guide to the Sky]]>We have liftoff!

liftoff

Cue me trying to not get my fingers cut off

The quadcopter has been built! I had some problems during the building process, so this is a guide which will hopefully make things easier for people to start off with making their own quadcopters.

I suggest going

]]>
http://arnavdhamija.com/2017/06/13/quadcopters-a-hitchhikers-guide-to-the-sky/5c360e6fce88a669ed3a7a8dTue, 13 Jun 2017 04:00:00 GMT

We have liftoff!

Quadcopters - A Hitchhiker's Guide to the Sky

Cue me trying to not get my fingers cut off

The quadcopter has been built! I had some problems during the building process, so this is a guide which will hopefully make things easier for people to start off with making their own quadcopters.

I suggest going through all these websites thoroughly before purchasing a single part for your quadcopter. It's not an extensive list, so I'll keep updating it as I find more useful links:

If you live in India, the following sites are good places for shopping for parts:

RCBazaar and RCDhamaka have stores in Bangalore. Robu.in is based in Pune.

Hardware Assembly

For clarity, the components I have used in my build are:

  • 1045 Propellers x4
  • REES52 DJI F450 Frame
  • REES52 CC3D F1 Flight Computer
  • REES52 1000KV A2212 30A Brushless Motor x4
  • REES52 SimonK 30A Electronic Speed Controllers x4
  • APM 4-axis Power Distribution Board Type-B1
  • SunRobotics 2200mAh 3S 35C LiPo Battery
  • FlySky FSCT6B Computer Transmitter

Most of these parts can be picked straight off Amazon. REES52 has a good bundle for a pair of propellers, 1000 KV motor, and ESC which I would recommend buying to keep costs low.

Frame

The first thing you would want to do is start off with the frame of the quadcopter. If you go for a DJI F450 frame (or one of its hundred clones), you will get a box with four arms (technically called booms) for mounting the motors and ESCs, a base which doubles as a power distribution board, and a board to hold the top of the copter together. It's pretty simple to set up and all you need is two Allen keys to screw everything together.

I suggest labeling each boom with a number and the direction in which its motor is supposed to spin. The below diagram is a good starting point:

Quadcopters - A Hitchhiker's Guide to the Sky
Speaking of labeling, label everything. It only takes a few seconds and it can save many hours of frustration later on. Of note is the handedness (is there a better term?) of the propeller. Left-handed propellers turn anticlockwise and right-handed propellers turn clockwise. Mounting a propeller the wrong way will cause it to produce thrust in the opposite direction.

Motors

Once you've set up the frame, the next thing you should do is take a look at the motors. These are brushless motors and they are very different from brushed motors. For starters, they have three wires instead of the two on brushed motors. Another peculiarity of these motors is that they are out-runner motors. This means that the case of the motor rotates with the propeller, not the motor shaft alone.

Quadcopters - A Hitchhiker's Guide to the Sky
Despite being much more power-efficient than brushed motors, these motors can generate a great deal of heat when running at full power. I would not recommend testing them at full power on the ground, as there won't be any airflow to cool the motors down. This can permanently damage a motor, and if you're unlucky, its ESC as well.

Anyway, the motors can be screwed directly into the booms of the quadcopter. Most motors come with a few accessories - an adapter for the base to mount it on to differently keyed frames and a propeller shaft with a nose tip. Don't worry if the propeller shaft seems to wobble or come loose. It's designed to only be fully secured when the propellers are mounted on the motor.

The direction of rotation is determined by the ESC, so we don't need to worry about that at the moment.

Electronic Speed Controllers (ESCs)

The ESCs link the flight computer (FC) to the motors. The input to each ESC is a PWM signal fed over its BEC connector. Each ESC has its own 8-bit SoC which sends signals to the brushless motor to turn it at the desired RPM. In the FC settings, you can adjust the ESC update frequency, which is usually between 50Hz and 490Hz.

A brief digression on the BEC - the thinner wires at the power input side of the ESC. Known as the Battery Eliminator Circuit, it steps down the battery voltage to a comfortable 5V for powering components such as the FC or any other devices you may want to connect to the aircraft. The FC only needs power from one BEC, and it may even be harmful to connect more than one BEC to the FC. There are some suggestions here on how to take care of this problem.

The ESCs regulate the amount of power going to the motors depending on throttle input, and they can get very hot as well. Mounting the ESCs under the booms gives them sufficient airflow to cool down properly. Make sure that the side with the three female bullet connectors faces the male bullet connectors of the motor. The other side, with the male power connector and the BEC, goes to the power distribution board of the quadcopter.

Mount the ESCs on to the booms with rubber bands, or better, zip-ties.

As with all electronics with large capacitors, ESCs can fail spectacularly when things go wrong. A short circuit or reversed polarity on the power input side can destroy an ESC in a matter of seconds.

On the other hand, things are a lot more flexible on the power output side of the ESC. The three output connectors can be connected to the motor in any order; a good rule of thumb is to connect the middle wire of the ESC to the ground wire of the motor, and to swap the other two wires whenever you need to reverse the direction of rotation.

Quadcopters - A Hitchhiker's Guide to the Sky

Power Distribution Board

The frame of your quadcopter will probably have a power distribution board of its own with solder points. However, I used a separate power distribution board with T-type connectors for the ESCs and the battery. It also has ports for the BECs and their signal wires, with a single 5V DC output.

This went right under the base of the quadcopter. It's a bit of a tight fit to get all the ESC wires under the base. Again, use zip-ties to secure it to the base.

The ground clearance is quite low with the power distribution board attached. You can buy a landing gear or make your own to increase this.

I made a DIY landing gear for my own quadcopter using a badminton shuttlecock tube. It absorbs shocks nicely and gives me just enough ground clearance to mount more components.

Quadcopters - A Hitchhiker's Guide to the Sky

Flight Computer and Battery

Now for the good stuff: go ahead and mount the CC3D on the top frame of the quadcopter. Most CC3Ds come with some plastic adapters and a sticky sheet for attaching the board to the quadcopter frame. Take note of the arrow on the CC3D and make sure it faces the direction in which you want your quadcopter to fly forward. Make sure you leave enough room to keep the mini USB port and the servo header pins accessible!

After binding the transmitter to the receiver, plug it into the receiver port of the CC3D. If you use the FlySky CT6B transmitter/receiver, the receiver probably won't work when directly connected to the CC3D on USB power. If so, you can use an Arduino's 5V and GND to power the receiver for testing it. When flying the quadcopter on Li-Po power, the CC3D will be powered by the ESC BECs which will give it enough headroom to power the receiver.

The receiver can be mounted pretty much anywhere as it is quite small. I mounted the receiver on my quadcopter's boom.

The final step is to add the Li-Po battery to the frame. Some prefer to mount the battery to the base of the quadcopter, but I found mounting the battery right under the top frame (under the CC3D) from E-W was much better for the quadcopter's stability. My guess is that this keeps the vertical CoG of the aircraft closer to the line of application of force, reducing the torque when it makes a maneuver.

Again, the place where you mount your battery is dependent on the size of the battery. Secure the battery with as many zip-ties as you want. There's no such thing as using too many zip-ties to secure components on quadcopters.

Putting It Together

ESC Calibration

Before you start configuring your CC3D, it's a good idea to make sure your ESCs and motors are working properly. To do this, connect an ESC directly to the battery and its BEC straight into the 3rd channel (the throttle channel) of your receiver. Keep the transmitter switched on with zero throttle input. On connecting the battery, you will hear a calibration beep from the motor. Push the throttle to max and hold it there until you hear another calibration beep. After bringing the throttle back to zero, you should be able to drive the motor by varying the throttle input.

In case the motor spins the wrong way, just swap the red and yellow wires of the motor.

Cable Management

This isn't really cable management, but I couldn't think of a better title for this section 😅

Connect the ESCs' BECs to the CC3D. The order will have to be changed later when setting up the CC3D so don't worry about that for now. I looped the cables through the holes in the top of the frame so they weren't hanging off the side of the quadcopter. The white wire should face upwards.

The receiver is quite straightforward to set up. The first cable has three leads: one for signal and two for power. This goes into the Channel 1 input of the receiver. The remaining cables have only one signal lead each; connect them in the same order as on the receiver port side of the leads.

The male connectors for the ESC can go right into the power distribution board.

Software Configuration

The CC3D is a nifty microcontroller board. It can be flashed with different FC firmwares and extended with GPS and telemetry capabilities. For now, we will work on a much more humble task - flashing it with an FC firmware.

As the CC3D has been around for a while, it has a good amount of community support in the form of open-source FC firmwares. Two popular ones are Cleanflight and LibrePilot, the latter built from the ashes of the now-defunct OpenPilot.

The firmware gives the CC3D the brains to control the aircraft, making decisions based on its accelerometer and gyroscope readings. It uses a PID control algorithm to do so.

Both firmwares come with cross-platform desktop apps for configuring the FC. Cleanflight uses a shiny Chrome web-app and LibrePilot has a more traditional Qt desktop app. I liked Cleanflight at first, but flashing it on the CC3D is a mess and I couldn't get it to work properly. On the other hand, LibrePilot is a joy to flash on the CC3D. The desktop app even auto-updates the CC3D's firmware if it is out of date. So, being unable to flash Cleanflight, I set up the quadcopter with LibrePilot instead.

LibrePilot has a pretty easy 'Vehicle Setup Wizard' which, if followed correctly, will get your quadcopter into airworthy shape 95% of the time. You'll want to keep the props off during the whole procedure, especially during ESC calibration.

Following this, you can mount the propellers by selecting the right adapter from the kit each prop comes with. Stick it into the reverse side of the propeller, and mount the propeller so that the side with the pitch and diameter printed on it faces upwards. You can then fasten the nose tip which comes with the motor using a thin screwdriver or Allen key. All of the quadcopter's thrust is generated by the propellers, so make sure this is tight. Once done, it should be impossible to pull the propeller shaft upwards - if it isn't, return the motor before you have a catastrophic accident. The last thing you want is a propeller dislodging itself from your quadcopter midflight.

Fly!

Or at least try to if you're a beginner :P If you're using LibrePilot, the default setting has stability assist which makes flying easier. For the more experienced (or gung ho) type of pilot, you can change the settings to use Rate/Acro mode to get full manual control of the aircraft.

Another thing worth looking at is tuning the PID values of the FC for better performance. However, LibrePilot's default values are good enough so this isn't necessary.

And that's it for this tutorial. I'll be writing more as I get more experienced with flying my quadcopter.

]]>
<![CDATA[The Y2038 Glitch]]>I had written this article a bit more than a year ago for a college magazine. This topic has been done to death in a Q/A format, so if you're looking for something new about Y2038, you probably won't find it here

Time has always been a finicky thing

]]>
http://arnavdhamija.com/2017/06/09/the-y2038-glitch/5c360e6fce88a669ed3a7a8cFri, 09 Jun 2017 10:00:00 GMTI had written this article a bit more than a year ago for a college magazine. This topic has been done to death in a Q/A format, so if you're looking for something new about Y2038, you probably won't find it here

Time has always been a finicky thing to deal with. Our perception of time without any stimulus is limited to less than an hour. We would be severely crippled without our abundance of electronic devices synchronized by internet atomic clocks. Unfortunately, the electronic devices we're so reliant on are heading towards a time-based computer disaster on the same scale as the Y2K bug - known as the Y2038 problem.

How do computers measure time and what is the Y2038 bug?

All UNIX-derived systems (including your iPhone or Android smartphone) measure time by counting the number of seconds that have elapsed since 00:00 UTC, 1 January 1970. This instant is known as the UNIX epoch. While this method of measuring time has served us well for many years, it faces a severe limitation - the time_t variable (an int variable used to hold the count) which stores the number of seconds since the epoch is only a 32-bit signed type on older 32-bit computers. This means time_t can only store up to 2³¹ - 1 seconds before the counter overflows.

Note: 32-bit systems can handle a 64-bit int by splitting it into two 32-bit words. However, the native word size on 32-bit systems is only 32 bits, and a 64-bit int has to be handled separately in software.

So how long before this overflow?

Exactly 2³¹ - 1 seconds after 00:00 UTC, 1 January 1970 - which is 03:14:07 UTC, 19 January 2038.

So what will happen?

The overflow will cause time_t to wrap around to -2³¹ seconds, which corresponds to 20:45:52 UTC, 13 December 1901. This is much more serious than your computer just showing a funny date: on systems with 32-bit CPUs, BIOS software, file systems, databases, network security certificates, and a wide variety of embedded hardware will fail to work. With critical machinery such as old electrical power stations controlled by 32-bit computers, the consequences could be disastrous.

Oh no! Does that mean my laptop and smartphone will stop working as well?

It depends. The 32-bit time_t has been deprecated and replaced by a 64-bit time_t, which will tide us over for roughly the next 292 billion years. 64-bit CPUs are standard nowadays, and most computers running a 64-bit OS on a 64-bit CPU will not be affected by the Y2038 bug, as they use the 64-bit time_t. However, smartphones have only recently shifted to 64-bit processors, and 32-bit smartphones and computers will be affected unless their software is updated to use a 64-bit count. That is, of course, only if you are still using them in 2038 :P

But how did this even happen? Why couldn't we just use a 64-bit time_t variable in the first place? It's only 4 bytes bigger!

Short answer: Legacy.

Long answer: The UNIX kernel was created at AT&T's Bell Labs in the 1970s by many competent programmers, including Dennis Ritchie of C fame. Initially, the Bell engineers used a different method of counting time, but found that its counter would overflow after only about 2.5 years. As UNIX was planned to be a long-running project, they changed the time counter to a 4-byte integer. This, of course, was also limited to a finite window, but in the 1970s the consensus was that computers weren't going to stick around for that much longer, and a 60-odd-year window was "good enough".

Couldn't they have just used a 64-bit int anyway?

Not really. Computer resources were scarce in the 1970s, and 8 bytes of memory to store time_t was more than engineers were willing to give.

Wait a second! Why doesn't time_t use an unsigned int? Surely UNIX wasn't programmed expecting us to go back in time!

The time_t variable is signed to account for dates before the UNIX epoch. Dates before 1 January 1970 are represented as a negative number of seconds, which is why an overflow takes the computer's date back to 1901.

But this is 2017! Can't we just upgrade the time_t variable to 64-bits?

Yes, we can. In all new software and 64-bit operating systems, time_t has been widened to 8 bytes. The problem is that legacy software compiled with older compilers may be incompatible with recompilation against an 8-byte time_t. And it doesn't stop there: many embedded systems and microcontrollers (including the popular 8-bit Arduino ATmega platform) have word sizes far smaller than 8 bytes, so a 64-bit counter must be handled in slow multi-word software arithmetic - and some hardware runs proprietary firmware which will never receive updates.

How can we fix the Y2038 problem?

There isn't any one-solution-fits-all approach for this, since it affects several computer architectures, hardware, and software. While it might be easier to just replace all old computers with new 64-bit capable ones, there are still many areas where old 32-bit code is prevalent, and the cost of upgrading all that hardware is simply exorbitant.

If we look at the lessons learned from the past, the Y2K bug was rectified because of the media hype pressuring businesses to update their software to accommodate 4-digit years. Although the Y2038 bug is much more critical and harder to understand than the Y2K bug, there is hope that the same pressure will bring change.

But for now, only time will tell.

Credits: xkcd

]]>
<![CDATA[Quadcopters - The Beginning!]]>I've been gravitating more towards working on electronics projects. It's a nice change from pecking the keyboard all day and it has plenty of interesting challenges of its own. The biggest difference I found is that you just can't take your hardware working correctly for granted. Things break and catch

]]>
http://arnavdhamija.com/2017/05/26/quadcopters-the-beginning/5c360e6fce88a669ed3a7a8eFri, 26 May 2017 04:00:00 GMTI've been gravitating more towards working on electronics projects. It's a nice change from pecking the keyboard all day and it has plenty of interesting challenges of its own. The biggest difference I found is that you just can't take your hardware working correctly for granted. Things break and catch fire all too easily if you're not careful with what you're doing.

My plan is to make a quadcopter from scratch (a rather loose term) and give it some ability to navigate unsupervised. This approach is (aptly?) called Simultaneous Localisation and Mapping, or SLAM. SLAM is by no means a cheap affair, be it in terms of hardware or software complexity. Using SLAM outdoors normally requires some kind of 3D localiser (such as a depth camera or LIDAR), GPS, and a fairly beefy computer to map the environment.

Of course, I'm setting my expectations low, and I will be very happy if I can get something out of this which is 10% as good as a human flying quadcopter by hand.

It all begins with the hardware and I'm going with this parts list:

  • 1045 Propellers x4
  • REES52 DJI F450 Frame
  • REES52 CC3D F1 Flight Computer
  • REES52 1000KV A2212 30A Brushless Motor x4
  • REES52 SimonK 30A Electronic Speed Controllers x4
  • SunRobotics 2200mAh 3S 35C LiPo Battery
  • FlySky FSCT6B Computer Transmitter

The amount of embedded systems tech in these components is astounding. Each motor is driven by an ESC, which is essentially a full-fledged 8-bit SoC. The CC3D Flight Computer is an engineering marvel in its own right - it comes with a 32-bit/72MHz flashable microcontroller, an MPU6000 gyroscope/accelerometer for the IMU, and expansion ports for GPS and telemetry data. Even the battery is remarkable - it's capable of discharging 850W at peak performance.

What I'm interested in, however, is controlling the CC3D in real-time using an Arduino. Typically, the CC3D is connected to a radio receiver wirelessly linked to a hand-operated transmitter. The receiver interfaces with the CC3D by outputting a PWM signal on each of its 6 channels when the transmitter is operated. Each channel is usually dedicated to one axis of movement - there are dedicated channels for throttle, yaw, roll, and pitch.

The idea behind this is that one day, I will have a lightweight computer (anyone willing to donate a NUC?) on the quadcopter doing SLAM in real-time. The computer will send throttle, pitch, yaw, and roll commands to the Arduino, which will in turn feed the CC3D with control signals to execute the manoeuvre plan. It may all seem like a Rube Goldberg machine, but it's a lot easier than building my own flight computer to do the same.

To kick things off, I had to find the PWM frequency the receiver uses for controlling the ESC. To do this, I connected a pin from the output channel of the receiver to an ad-hoc Arduino Leonardo oscilloscope. Using the Processing library, we can get a 0-5V reading of the value on the A0 analog input of the Arduino.

Capture2
Not a bad start! The peaks are a pretty clear sign that it's a PWM signal with a low duty cycle. Using a ruler and some more code, I found that the peak-to-peak time is about 20ms, which works out to a PWM frequency of 50Hz. This is in line with some websites speculating that it works similarly to servo control.

The length of time at the peak gives us the signal we need to send from the Arduino. Some more analysis of the waveform showed that the pulses were about 1-2ms long. A bit more research revealed that this is very similar to the waveform used for controlling servos. Fortunately, the Arduino has a Servo library meant for just this purpose. Using its writeMicroseconds() function, I got a pretty similar waveform:

Capture7
I would go as far as to say that this is an even cleaner signal than what the receiver outputs. I needed a 5V->3.3V voltage divider for the output, so I used 3 identical resistors in series to do the job.

Next thing was to get the CC3D to accept the input. For this it requires manual calibration of the transmitter. In LibrePilot, the procedure is quite straightforward - you need to move the sticks through their minimum, maximum, and neutral positions to calibrate the CC3D for your transmitter. To do this, I wired a push button to the Arduino which cycles through 3 states each time it is pressed - 1100µs, 1500µs, and 1900µs. Not very elegant. Then again, if it's stupid but works, it ain't stupid.

I will be building the rest of the quadcopter once the parts arrive. You can find the code for the Arduino part here: https://github.com/shortstheory/quadcopter-experiments/blob/master/tristate.ino

]]>
<![CDATA[Reading List]]>Recently, I've been trying to read more long-form literature. It's been an interesting experience to wean myself off of social media and that experience certainly deserves a post of its own soon. Until then, here are some of my thoughts on the books I finished reading this year:

The

]]>
http://arnavdhamija.com/2017/03/12/reading-list/5c360e6fce88a669ed3a7a91Sun, 12 Mar 2017 18:20:00 GMTRecently, I've been trying to read more long-form literature. It's been an interesting experience to wean myself off of social media and that experience certainly deserves a post of its own soon. Until then, here are some of my thoughts on the books I finished reading this year:

The Checklist Manifesto - Atul Gawande

The title gives away the premise of the book - the art of managing and creating efficient checklists. This may not exactly sound like a jaw-clenching thriller, but Atul Gawande manages to keep the pages turning with interesting anecdotes about using checklists. A surgeon by profession, Gawande recounts instances of how his carefully curated surgery checklist saved lives in the operating theater. As an aviation geek, I enjoyed the story of how creating a pre-flight checklist for the Boeing Model 299 (which became the B-17) saved Boeing from the brink of bankruptcy in the fiercely competitive aviation market of the 1930s. There are narratives about the flip side too: verbose checklists with too many steps for the user to follow, for example, may do more harm than having no checklist at all.

For the most part, the book reads as a collection of anecdotes of varying quality, and its pointers on creating a good checklist are scattered too thinly throughout to be useful. That said, my opinion is probably not the best one to go by, as I read the book over a period of months, which broke its continuity.

Freakonomics: A Rogue Economist Explores the Hidden Side of Everything - Levitt and Dubner

I've seen this book practically everywhere. I've heard praises sung about it at length by friends. I've even been listening to the Freakonomics radio podcast for quite a while. But it was only last month that I picked up this book. The cover makes bold claims such as "...Explores The Hidden Side of Everything" along with some terms of endearment from Malcolm Gladwell. As for the content, the book covers a range of bizarre topics such as 'What do schoolteachers and sumo wrestlers have in common?' and 'Why do drug dealers still live with their moms?'. This is great, because the book draws convincing conclusions on these two topics. However, the same cannot be said for some of the other chapters, such as 'What Makes a Perfect Parent?' and the one on whether a child's name is detrimental to their future prospects. I feel the book doesn't spend enough time establishing how a certain cause leads to an effect in these chapters. The authors do include a long list of citations for each claim at the end of the book (nearly as long as the content itself!), so their claims might be substantiated after all.

Russian Roulette - Anthony Horowitz

It took me forever to find this book. I used to be an avid fan of Alex Rider in school, but I lost track of the series in my dry spell of reading in 11th and 12th. Russian Roulette is something of a spin-off, set somewhere in the middle of Stormbreaker in the Alex Rider timeline. The book is all about Yassen, a key character in the Alex Rider series. Like most Anthony Horowitz books, it's not a particularly well-written book; rather, it keeps one engaged with a fast (but somewhat predictable) plot. I liked some of the parts from Yassen's past in Russia, though the rest read like a rehash of Scorpia. Still, I finished this book much faster than I usually take to finish a book of its length.

Bird by Bird - Anne Lamott

In this book, Anne Lamott speaks about the joys and frustrations of the solitary world of a writer. From the outset, I loved this book. Lamott's writing style is hilarious, loaded with dry wit and sarcasm. As with The Checklist Manifesto, the book goes off on tangents with no central plot to speak of. On my first read of Bird by Bird, I couldn't help feeling that it would read better as a blog than as a novel. Though it is of much higher quality than a typical blog, the rant-filled content would slide perfectly into that format.

Lamott also peppers the book with some very quotable passages. I won't spoil them here, but it's a must-read for anyone in the publishing industry or the writing business.

]]>