Planet dgplug

February 13, 2019

Kushal Das

Tracking my phone's silent connections

My phone has more friends than me. It talks to more peers (computers) than the number of human beings I talk to on an average day. In this age of smartphones and mobile apps for everything from A to Z, we are dependent on these technologies. At the same time, we don't know much about what is going on inside these computers, equipped with powerful cameras, GPS devices, and microphones, which we carry all the time. All these apps are talking to their respective servers (or should we call them masters?), but there is no easy way to track those conversations.

These questions bothered me for a long time: I wanted to see which servers my phone connects to, and I wanted to be able to block those connections as I wished. However, I never managed to work on this. A few weeks ago, I finally sat down and started building a system, reusing already available open source projects and tools, that would let me track what my phone is doing. Maybe not in full detail, but it would at least shed some light on the network traffic from the phone.

Initial trial

I tried creating a wifi hotspot at home using a Raspberry Pi, capturing all the packets from the device using standard tools (dumpcap), and later reading through the logs using Wireshark. This procedure meant that I could only capture traffic while I was connected to the network at home. What about when I am not at home?

Next round

This time I took a slightly different approach. I chose algo to create a VPN server. Using WireGuard, it became straightforward to connect my iPhone to the VPN. This setup also makes it very easy to capture all the traffic from the phone on the VPN server. A few days into the experiment, Kashmir started posting her experiment named Life Without the Tech Giants, where she blocked all the services from 5 big technology companies. With her help, I contacted Dhruv Mehrotra, the technologist behind the story. After talking to him, I felt that I was going in the right direction. He has already posted details on how they did the blocking, and you can try that at home :)

Looking at the data after 1 week

After capturing the data for the first week, I moved the captured pcap files to my computer and wrote some Python code to put the data into a SQLite database, enabling me to query the data much faster.
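The Python code and schema from the post are not shown; purely as an illustration, with a made-up table layout, the kind of per-domain counting a SQLite database makes fast can be sketched with the sqlite3 command-line shell:

```shell
# Illustration only: an invented schema for captured DNS queries, plus the
# aggregate query behind a "most queried domains" graph. The real code was Python.
db=$(mktemp)
sqlite3 "$db" <<'SQL'
CREATE TABLE dns_queries (ts TEXT, domain TEXT);
-- A few sample rows standing in for a week of captures.
INSERT INTO dns_queries VALUES
  ('2019-02-01T10:00:00', 'api.twitter.com'),
  ('2019-02-01T10:00:05', 'gateway.icloud.com'),
  ('2019-02-01T10:01:00', 'api.twitter.com');
-- Domains queried most often (a HAVING COUNT(*) >= 10 clause would mimic the
-- "queried at least 10 times in a week" cut-off used for the graph below).
SELECT domain, COUNT(*) AS hits
  FROM dns_queries
 GROUP BY domain
 ORDER BY hits DESC;
SQL
```

Once the pcap data is in a table like this, each graph in the post is essentially one GROUP BY query.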

Domain Name System (DNS) data

The Domain Name System (DNS) is a decentralized system which helps translate human-memorable domain names into Internet Protocol (IP) addresses. Computers talk to each other using these IP addresses; we don't have to worry about remembering so many numbers. When developers build their applications for the phone, they generally use domain names to specify where the app should connect.

If I plot all the different domains (including subdomains) that were queried at least 10 times during the week, we get the following graph.

The first thing to notice is how often the phone tries to find servers from Apple, which makes sense as this is an iPhone. I use the Twitter mobile app a lot, so we also see many queries related to Twitter. Lookout deserves a special mention; it was suggested to me by friends who understand these technologies and security better than I do. Google takes the third position: I sometimes watch YouTube videos, but the phone also queried many other Google domains.

There are also many queries to the Akamai CDN service, and I could not find any easy way to identify which services those hosts belong to; the same goes for Amazon AWS related hosts. If you know a better way, please drop me a note.

You can also see that a lot of data-analytics companies were queried. One of them is a major one, and thankfully algo already blocks that domain at the DNS level. I don't know which app is trying to connect to which server; I identified a few of the apps on my phone by searching the client lists of the above-mentioned analytics companies. In the coming months, I will start blocking those hosts/domains one by one and see which apps stop working.

Looking at data flow

The number of DNS queries was an easy starting point, but next I wanted to learn more about the actual servers my phone talks to. The paranoid part of me was pushing to discover these servers.

If we plot all of the major companies the phone is talking to, we get the following graph.

Apple leads the chart with 44% of all the connections: 495225 in total. Twitter is in second place, and Edgecastcdn is third. My phone talked to Google servers 67344 times, roughly 7 times fewer than to Apple's own servers.

In the next graph, I removed the big players (including Google and Amazon). Then I can see that analytics companies account for a big share; two of them alone have 31% of the connections, which is a lot. Most probably I will start by blocking those two first. The 3 other CDN companies, Akamai, Cloudfront, and Cloudflare, have 8%, 7%, and 6% respectively. Do I know what these companies are tracking? Nope, and that is scary enough that one of my friends commented, "It makes me think about throwing my phone in the garbage."

What about encrypted vs unencrypted traffic? Which protocols are being used? I tried to answer the first question, and the answer looks like the following graph. Maybe the number will come down if I refine the query and add other parameters; that is a future task.

What next?

As I said earlier, I am working on a set of tools that can be deployed on the VPN server and that will provide a user-friendly way to monitor, and block or unblock, traffic from a phone. The major part of the work is making sure that the whole thing is easy to deploy and can be used by someone with little technical knowledge.

How can you help?

The biggest thing we need is knowledge of how to analyze the data we are capturing. It is one thing to make reports for personal use, but trying to help others is an entirely different game altogether. We will, of course, need all sorts of contributions to the project. Before anything else, we will have to turn the random code we have into a proper project structure. Keep following this blog for more updates and details about the project.

Note to self

Do not try to read data after midnight, or else I will again mistake a local address for some random dynamic address in Bangkok and freak out (thank you, reverse DNS).

by Kushal Das at February 13, 2019 02:47 AM

February 12, 2019

Jason Braganza (Work)

Thank you, Kushal!

Began reading The Warren Buffett Shareholder today.
This is from the preface.

Many contributors to this book remark upon Buffett's distinctive teaching style, which tends to instruct people how to think rather than what to think.


John Bogle has attended one Meeting, but attests that even one can change your world.

A couple of pages later

Our premise was that Berkshire’s intrinsic value owes a lot to the Meeting and the shareholder community.
Buffett wrote in his 2014 letter …

… This culture grows stronger every year, and it will remain intact long after Charlie and I have left the scene.

Berkshire Hathaway has created a culture of intelligence, inquisitiveness, integrity, and learning. This culture is part of the “company” in both the corporate meaning of that word and in its sense as a society of people coming together (com) to break bread (pan).

Replace Warren with Kushal, Berkshire with the DGPLUG IRC channel and the shareholder meeting with the Summer Training.
And, nothing changes.

Amidst all the shouting and the craziness that is the channel generally, it all goes up a hundredfold when the training happens.
Tempers flare. The kids are unruly. Mayhem ensues.

Yet, it all settles down soon enough.
Folks learn earnestly.
Wisdom is shared.
Bonds are made. Friendships built.
Across time and space.

And the Atlas who holds this little world on his shoulders is Kushal.
It is he, who literally, wrote the book on what we learn.
It is he, who pays for and maintains much of the infrastructure we need.
It is he, who conducts quite a few of the topics we learn.
It is he, who bribes, and cajoles old mages to come share their wisdom, with callow, inexperienced youth.
And it is he, who keeps this little corner of the world, warm and cozy and friendly, year after year after year.

The folks who owe their careers to him are many.
And the folks who have their lives changed by the training, many more still.
I don’t remember if I ever said this to him, but he has given more to humanity in ten years than others have in their entire lives.
And somewhere in the middle of the chapter, I found what succinctly summarises the way I feel about the Summer Training.

And amid their decades of lessons, they get to the core message of all shareholders at the Berkshire Annual Meeting: if you’ve never been, go; if you always go, keep going.

And for everything you do, thank you Kushal!

by Mario Jason Braganza at February 12, 2019 03:43 AM

February 10, 2019

Jason Braganza (Work)

On Writing as an Important Career Skill

Jason Fried, founder of Basecamp and prolific blogger, on why writing is important and how (and why) he uses writing as a hiring filter.

First, in an interview with Chase Jarvis on youtube.

CJ: What are some other really key things that you look for, that you’ve built into your company? Just some of the ones that are maybe more important to you well yeah how do you think about it?

JF: Well a lot of it is the things we don’t want. That’s how we think about it primarily.

… We don’t want to sit in meetings all day, so we don’t have a meetings-heavy culture, which means that we write a lot of things down versus saying them out loud. So we write long form and write in detailed passages, so people can absorb everything on their own time versus having a meeting to pull people off their work, to sit in a room together, to talk about something that has nothing to do with right now, but you’re having the meeting right now. It’s very inefficient actually, a very inefficient way of doing it. So to do that (work efficiently), to facilitate that, we have to hire great writers.

So we don’t hire people who can’t write. Very, very, very important. That’s probably the number one hiring criterion after, like, can they do the work? Are they good at the thing? But the next thing is, can they write? And if they can’t write well, we will not hire them.

CJ: So do you do a test? A written communication test?

JF: They do the test for us essentially by submitting cover letters. We look at the cover letter first. We don’t look at the resume, don’t care about previous experience, don’t care about where they went to school, don’t care about any of that stuff. We look at the cover letter, and if they don’t have one, the resume gets tossed. Alright, they have to be able to write to us, saying why they want this job, who they are, what’s important to them, why this job and not just any job (or if it is any job, just say that too), but I want to be able to read it.

And you read the letter and you quickly can tell.
This person can write, this person can communicate, they can express themselves, they’re clear minded, they’re thoughtful, they’re good at nuance and the subtleties that matter, that separate them from somebody else.

They know how to persuade, and persuasion is super important in any line of business because you’ve gotta sell; not always sell to a customer, but sell an idea internally, to your team, whatever it is. So the cover letter is fundamental for us, and we’re very, very careful about that.
So that’s the writing test. It’s not a test, but it is.

And then on a podcast with Tim Ferriss

TF: You mentioned, since we’re talking about Warren Buffett, clear thinking and clear writing, which you seem to value very highly. In doing a little bit of homework for this conversation, I’ve read, and you can tell me if this is true or still the case, but how one of your top hiring criteria is whether the person is a great writer. Whether the person can communicate well in written form. I’d love for you to say whether that is still the case or not and why that is the case.

JF: It’s definitely the case. It’s sort of been the case forever for us. It’s the case because first of all, most communication is written these days. First of all, let me step back. We’re a remote company, so especially most communication is written. If you’re going to have, if you’re a local company and you’re having meetings all the time, sometimes verbal is enough. But in most cases, people are writing more and more and more than they ever have before.

One of the most costly and inefficient things is having to repeat yourself, or answer questions about something that should’ve been clear in the first place. If you can’t communicate clearly, you’re communicating probably three or four times more frequently than you need to. That can be really inefficient and really frustrating, extremely frustrating. So, we’ve always looked at clear writing as a prerequisite for every position we have at the company because everybody is supposed to communicate with themselves, with the rest of the company, with their team. Most of it’s done via the written word.

We also, for example – by the way, the first sort of gatekeeper of it is the cover letter. When people apply for a job, if they just send a résumé or whatever, they’re out instantly. It’s not even something we look at. I always want to see a cover letter of some sort. It can just be an email, of course, with an attachment of the résumé or whatever, but I want to see how you open the conversation. How do you describe yourself? Why are you applying for this job and not just any job? You can tell very quickly if someone can explain themselves, if someone can advocate for themselves and advocate for their ideas and their position and who they are and why they should work here.

If they’re clear minded, if they’re friendly, all that stuff comes through in writing. I think if you pay enough attention to the words, you can see a lot of that. Also, for example, when I hire designers, I look at their design, but I look at their writing almost a little bit more. Whenever we hire a designer, when we get down to the last five candidates that we really think could be finalists, we hire them to do a project for us. $1,500 a week and they do a project for us so we can kind of see their actual work. But even more importantly, is we ask them to write up why they did what they did because there’s a lot of great designers out there. But people have to advocate for themselves. I want to see why they did what they did and I want to read their point of view and their line of thinking and how they came up with the solutions.

When I read that, it helps me understand what would it be like to work with that person for real. Are they able to explain why they did what they did? Are they glossing over little details that actually matter? What is it? How do they see their work and how do they write about it? It’s a very important part of the job here at Basecamp. So, yeah, writing’s important in every position. It doesn’t matter what position you’re in, you still have to be a great writer. We typically start by looking at the cover letter. The actual work assignments to get hired here typically involve writing, no matter what the position. I think it’s always proven, in our case, to be a really good indicator of someone’s success here.

Whenever we hire someone, we’re like, I don’t know, the writing wasn’t quite right. It almost always pans out that it turns out that they’re not the right fit for the company, even if their work was great. But they’re just unable to really convince people and persuade people based on a missing bit of – it’s not magic, but I’ll call it that – in the writing. Where you read something and you go, this is great.

P.S. Liked this post? Then subscribe to

by Mario Jason Braganza at February 10, 2019 03:30 PM

Kushal Das

When I was sleepy

Back in 2005 I joined my first job, at a software company in Bangalore. The company was the backend of a big foreign bank. We trained heavily on different parts of software development during the first few months. At the same time, I had an altercation (about some Java code) with the senior manager who was in charge of the new joinees and their placement within the company. The result? Everyone else got a team but me, and I had to roam around the office to find an empty seat and wait there till the actual seat owner came back. I managed to spend a lot of days in the cafeteria on the rooftop. But then they made a new rule that one could not sit there either, other than at lunch time.

So, I went around asking, talking to all the different people in the office (there were 500+ folks, iirc), whether they knew of any team that would take on a fresher. I tried to throw in words like Linux and open source to better my chances. Then one day, I heard that the research and development team was looking for someone with Linux and PHP skills. I went in to have a chat with the team, and they told me the problem (it was actually about DSpace, a Java-based documentation/content repository system), and after looking at my resume they decided to give me a desktop for a couple of weeks. I managed to solve the problem in the next few days, and after a week or so, I was told that I would join the team. There were a couple of super senior managers, and I was the only kid on that block. Being part of this team allowed me to explore different technologies and programming languages.

I will write down my experiences in more detail later, but for today, I want to focus on one particular incident; the kind of incident which all system administrators experience at least once in their life (I guess). I got root access to the production server of the DSpace installation within a few weeks. I had a Windows desktop and used PuTTY to ssh in to the server. As this company was the backend of the big bank, except for a few senior managers, no one else had access to the Internet on their systems. There were 2 desktops in the kiosk on the ground floor, and one had to stand in a long queue to get a chance to access the Internet.

One day I came back from lunch (a good one) feeling a bit sleepy. I had taken down the Tomcat server, pushed the changes to the application, and then wanted to start the server up again. I typed the whole path to the startup script (I don't remember the actual name) and hit Enter. I was waiting for the long screens of messages this startup script spewed as it started up, but instead, I got back the prompt quickly. I was wondering what went wrong. Then, looking at the monitor very closely, I suddenly realised that I had been planning to delete some other file, had written rm at the beginning of the command prompt, forgotten it, and then typed the path of the startup script after it. Suddenly I felt the place get very hot and stuffy; I started sweating, and all the blood drained from my face in the next few moments. I was at panic level 9, wondering what to do. I thought about the next steps to follow. I still had a small window of time to fix the service. Then I realized that I could get a copy of the script from the Internet (yay, Open Source!). So, I picked up a pad and a pen, ran down to the ground floor, and stood in the queue to get access to a computer with the Internet. After getting a seat, I wrote down the whole script on the pad and double-checked it. I ran right back up to my cubicle and feverishly typed in the script (somehow, miraculously, without any typo, in one go). As I executed it, I saw the familiar output, messages scrolling up, screen after joyful screen. And finally, as the server started up, I sighed a huge sigh of relief. After the adrenaline levels came down, I wrote an incident report to my management, and later talked about it during a meeting.

From that day on, before doing any kind of destructive operation, I double-check the command prompt for any typo. I make sure that I don't remove anything randomly, and also that I have my backups in place.

by Kushal Das at February 10, 2019 02:31 AM

February 05, 2019

Anwesha Das

Have a safer internet

Today, 5th February, is Safer Internet Day. The primary aim of this day is to advance the safe and positive use of digital technology for children and young people, and to promote conversation on this issue. So let us discuss a few ideas. The digital medium is the place where we live much of our lives today; it has become our world. However, compared to the physical world, this world and its rules are unfamiliar to us. Adding to that, with the advent of social media we are putting our lives, every detail of them, into the domain of social media. We are letting governments, industrial lords, political parties, snoops, and society judge, watch, and monitor us. We, the fragile, vulnerable us, do not have any other option but to watch our freedom and privacy vanish.

Do we not have anything to save ourselves? Ta da! Here are some basic ideas you can try to follow in your everyday life to keep yourself safe in the digital world.

Use unique passphrases

Use passphrases instead of passwords. Passwords are easy to break as well as easy to copy, so instead of using “Frank” (a name) or “Achinpur60” (a part of your address), use a passphrase like “DiscountDangerDumpster”. It is easy to remember and hard to break. You can even mix 2 or more languages (that is easy for us Indians, right?). I used diceware to generate that passphrase. Moreover, by unique what I mean is: do not use the SAME PASSWORD EVERYWHERE. I can feel how difficult, tedious, even impossible it seems to remember all the lengthy passphrases (not passwords, remember!) for all your accounts. However, there is no way around this: if someone gets your passphrase for one account, they would otherwise be able to access all of them. Unique passphrases help a lot in this case.

Use password managers

To solve the above-mentioned problem of remembering long passphrases, you have a magic thing called a password manager. Just wave your wand (read: mouse) once, and you can find your long passphrases safely protected in their vaults. There are many different password managers: LastPass, KeePassXC, etc. If you want to know more about this, please read about it here.

Do not leave your device (computer, phone, etc) unlocked

My 2-year-old once typed some not-so-kind words (thanks to autocorrect) to my in-laws, and the consequences it brought still make me shiver. Thankfully it was not someone with good technical knowledge and not-so-good intentions, who could have caused much greater, perhaps irrecoverable, damage. So please do not leave your device unlocked.

Do not share your password or your device with anyone

Sharing your password with anyone poses similar kinds of danger as those mentioned above.

Do block your webcam and phone’s camera

It is now a well-known fact that attackers spy on us through our web cameras, deceiving users into installing webcam spyware. Many of us may think, “Oh, we are safe; our device has an indicator light, so we will know when and if any video recording is happening.” However, it is very much possible to disable the activity light through configuration changes and software hacks. So even if there is no light, your video can very well be captured.

Do not ignore security updates

Most of the time, when a security update notification pops up in the morning, we smoothly ignore it for our morning dose of news or our social media feed. However, that is one of the most irresponsible things you can do in your day. It may be your last chance to secure yourself from future danger. Many times, cyber attackers take advantage of old, outdated software and attack you through it. It may be your old PDF reader, web browser, or operating system. So, the most basic thing in your digital security lesson 101 is to keep your software up to date.

Acquire some basic knowledge about your machine

I know (trust me, I have been through this phase), but please acquire some basic knowledge about your machine: e.g., which version of the operating system you are using, the other software on your machine and their version numbers, and whether they require any updates.

Do not download files from random websites

Do not download files from random websites on the internet; they might contain malware or viruses. These might affect not only your machine but all the devices on the network. So, please check the website you are downloading from.

The same caution applies to links: do not click on random URLs you receive over email or on social media sites.

Use two-factor authentication

Two-factor authentication is simply two steps of validation. It adds an extra layer of security to your accounts. With 2FA, the user provides two proofs of identity instead of one password. It is advisable to have your 2FA app on your mobile phone, or even better, to use a hardware token like a Yubikey. That way, if someone wants to hack your account, they have to get hold of both the password and the phone (or token).

Use Tor network

The Tor Project is the most trusted and widely recommended way to remain private and retain your anonymity. Their website defines Tor as “free software and an open network that helps you defend against traffic analysis, a form of network surveillance that threatens personal freedom and privacy, confidential business activities, and relationships, and state security.” Have a look at it to know more.

Take proper legal action

If something terrible happens to you online, please visit the local cyber crime department and lodge a formal complaint there. Local police stations do not deal with matters related to cyber crime, so you may want to go directly to the appropriate cyber security cell. If you have no idea where it is or what to do there, go to your local police station, take their advice and the information you need, and then go to the cyber security cell.

Learn GPG encryption

It is always good to know and learn GPG (GNU Privacy Guard) if you are up for learning more technical things. It is a bit difficult, but surely a very useful tool for keeping your privacy secured.

The steps mentioned above may sound like "too much" to maintain. But pretend that your house is your device and your password is the key to enter it. You normally do everything possible to keep your house keys safe, and the same rules apply here. These rules are nothing but a habit, like getting up in the morning: it seems difficult the first few times, but after that it is as organic and normal as can be. So, build the habit of keeping safe; merely using these tools will not offer the results you need.

Hope you have a happy, safe life in the digital world.

by Anwesha Das at February 05, 2019 05:40 PM

February 04, 2019

Sayan Chowdhury

Editing Git commits from history

Let's say you've raised a Pull Request on GitHub with 3 commits as shown below.

d46d2f8a (HEAD -> feature, devel/feature) Add Fedora system to handle the Fedora Cloud images
f726b033 vendor: Add new dep
dbdbda67 Split release and pre-release to handle multiple systems

Here, d46d2f8a is the HEAD, followed by f726b033 and dbdbda67.

Your reviewers go through the commits and suggest changes to some files. The suggestions now need to be applied to the commit dbdbda67. How would you fix this? This is a jam I usually land myself in, as for review I prefer my commits to be concise, to show incremental changes, and to be atomic.

There are two ways you can use to fix this issue:

Rebase and Edit

I start off with git rebase -i, which opens up the list of commits for rebase. In our example, the third commit from HEAD needs to be changed, so I'll rebase the commits up to HEAD~3.

git rebase -i HEAD~3

This opens up the editor, with the options

pick dbdbda67 Split release and pre-release to handle multiple systems
pick f726b033 vendor: Add new dep
pick d46d2f8a Add Fedora system to handle the Fedora Cloud images

Below this, you will find a list of commands you can use with git-rebase:

# p, pick <commit> = use commit
# r, reword <commit> = use commit, but edit the commit message
# e, edit <commit> = use commit, but stop for amending
# s, squash <commit> = use commit, but meld into previous commit
# f, fixup <commit> = like "squash", but discard this commit's log message
# x, exec <command> = run command (the rest of the line) using shell
# d, drop <commit> = remove commit
# l, label <label> = label current HEAD with a name
# t, reset <label> = reset HEAD to a label
# m, merge [-C <commit> | -c <commit>] <label> [# <oneline>]
# .       create a merge commit using the original merge commit's
# .       message (or the oneline, if no original merge commit was
# .       specified). Use -c <commit> to reword the commit message.

The list is quite coherent, but for this post, we'll use the edit command. So replace pick with edit against the commit you would like to edit and exit the editor.

So for our example, it'll become (note the edit command in the first line):

edit dbdbda67 Split release and pre-release to handle multiple systems
pick f726b033 vendor: Add new dep
pick d46d2f8a Add Fedora system to handle the Fedora Cloud images

# Rebase 51c8dc75..d46d2f8a onto 51c8dc75 (3 commands)
# Commands:
# p, pick <commit> = use commit
# r, reword <commit> = use commit, but edit the commit message
# e, edit <commit> = use commit, but stop for amending
# s, squash <commit> = use commit, but meld into previous commit
# f, fixup <commit> = like "squash", but discard this commit's log message
# x, exec <command> = run command (the rest of the line) using shell
# d, drop <commit> = remove commit
# l, label <label> = label current HEAD with a name
# t, reset <label> = reset HEAD to a label
# m, merge [-C <commit> | -c <commit>] <label> [# <oneline>]
# .       create a merge commit using the original merge commit's
# .       message (or the oneline, if no original merge commit was
# .       specified). Use -c <commit> to reword the commit message.
# These lines can be re-ordered; they are executed from top to bottom.
# If you remove a line here THAT COMMIT WILL BE LOST.
#       However, if you remove everything, the rebase will be aborted.
# Note that empty commits are commented out

What this will do is stop the rebase process at the commit you wanted to edit. At this point, you can go ahead and make your changes: add files, modify them, remove them, etc.

Once you're done, add the files using git add and continue the rebase using:

git rebase --continue

The other way? Rebase and Fixup

I've recently started to use this method a whole lot more than the previous one I described. Here, you can just go ahead and make the changes.

Once done, pass the --fixup argument while making the commit. For our example:

git commit --fixup=dbdbda67

This would create a fixup commit whose message starts with fixup!. Next, run an interactive rebase with --autosquash over the range from the parent of the commit being fixed through the fixup commit.

git rebase -i --autosquash dbdbda67~1

This will open the todo list with the fixup commit already moved below the commit it fixes; save and exit, and the two are melded into one.
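The same flow can be scripted end to end in a throwaway repository for illustration (file names invented; GIT_SEQUENCE_EDITOR=true accepts the auto-arranged todo list so no editor opens):

```shell
# Demo of the --fixup / --autosquash flow in a scratch repository.
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email demo@example.com && git config user.name demo

echo base > base.txt && git add . && git commit -qm "Initial commit"
echo v1 > release.go && git add . && git commit -qm "Split release and pre-release"
echo v1 > dep.txt    && git add . && git commit -qm "vendor: Add new dep"

target=$(git rev-parse HEAD~1)   # the commit the reviewer wants changed

# Make the suggested change and record it as a fixup of that commit.
echo "reviewed change" >> release.go
git add release.go
git commit --fixup="$target"     # subject: "fixup! Split release and pre-release"

# --autosquash moves the fixup below its target and melds them together;
# GIT_SEQUENCE_EDITOR=true accepts the generated todo list as-is.
GIT_SEQUENCE_EDITOR=true git rebase -i --autosquash "$target~1"
git log --oneline                # back to three commits, fix folded in
```

Setting rebase.autosquash to true in your git config makes every interactive rebase honour fixup! commits automatically, without the --autosquash flag.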

Voila! Serve the PR while hot. 🍜

Do you know easier methods to tackle this issue? Drop me a message on Twitter @yudocaa. I would like to thank Jason, Jaysinh, and Sayani, who proofread this blog post. Thanks to @anniespratt for the cover image.

by Sayan Chowdhury at February 04, 2019 05:29 PM

February 03, 2019

Jason Braganza (Personal)

When Death Comes

I want this glorious verse from Mary Oliver’s poem to be my eulogy, when, you know, my death comes.

When it’s over, I want to say: all my life
I was a bride married to amazement.
I was the bridegroom, taking the world into my arms.

When it’s over, I don’t want to wonder
if I have made of my life something particular, and real.
I don’t want to find myself sighing and frightened,
or full of argument.

I don’t want to end up simply having visited this world.

— via Austin Kleon’s touching eulogy to Mary Oliver.

P.S. Also love her instructions for living a life.

  • Pay attention.
  • Be astonished.
  • Tell about it.

P.P.S If you enjoy reading what I write and share, go subscribe.

by Mario Jason Braganza at February 03, 2019 02:30 PM

January 28, 2019

Jason Braganza (Personal)

Supernova in the East

If you haven’t already heard me raving about Hardcore History and Dan Carlin then you’re about to :)

Hardcore History is the world’s slowest podcast. The Accidental Tech Podcast, a topical weekly Apple news podcast that I listen to, started in 2013 and, as of today, 16th January 2019, is now on episode 308. Hardcore History, on the other hand, began its run in 2005 and is now on episode 65. I just checked the feed and Dan averages a measly two episodes a year.

In truth however, it makes very little sense to look at them as podcast episodes. Think of them as books. Medium length audiobooks. And then it suddenly makes sense. A book a year. An engaging history book, a year. For free!

Not that you’ll want to just stick to free anyhoo. Dan is super engaging and, like so many folks say, he makes history come alive.

After a few episodes, you’ll be begging to give him your money. The man is that good. And if you are so inclined, the entire back catalogue is available for purchase. My favourite is the Wrath of the Khans. 1

And why all this raving now? Because it’s time for “Supernova in the East II”, the first Hardcore History episode of 2019. Find Supernova in the East I here.

From the episode’s description:

The Asia-Pacific War of 1937-1945 has deep roots. It also involves a Japanese society that’s been called one of the most distinctive on Earth. If there were a Japanese version of Captain America, this would be his origin story.

Intrigued? Then go listen. And subscribe to all future episodes in your podcast player of choice, using this link.

You can thank me later.

P.S. The pic below is a glimpse of the research that goes into one, single episode.

  1. This link goes to a separate compilation download just for this series. 

by Mario Jason Braganza at January 28, 2019 12:30 AM

January 23, 2019

Sayan Chowdhury

Vim, Wish I knew this about you before!


Almost all my life, since I started working around open source projects, I have been a Vim user. That does not mean I did not look into other options. I've tried my hands at Emacs, Sublime Text, Visual Studio Code, Atom, et cetera. But none of them pleased me like Vim.

In 2018, I started working with Golang, and Vim had to adapt to my needs. It did not quite succeed, so I made my switch to Neovim around the end of the year. Around the same time, I made a resolution to dive deep into Neovim to increase my productivity.

Jump Lists

In Golang, I heavily use :GoDef, which is part of the vim-go plugin, to go to a specific symbol or declaration.

Vim keeps track of all the jumps (previously visited cursor positions). :jumps lists all the performed jumps for the current window. Ctrl+O and Ctrl+I help you cycle through the jumps. But what counts as a jump?

(Screenshot: Vim's list of jump-triggering commands)

Any of the actions mentioned on the above list counts as a jump, and makes an entry into the jump list. You can clear the jump list using :clearjumps.

(Screenshot: sample output of :jumps)

The columns are jump, line, column and file/text. Given the above:

  • Ctrl-I to jump to line 415 in the current buffer.
  • Ctrl-O to jump to line 358 in the current buffer.
  • 3 then Ctrl-O to jump to line 364 in current buffer.
  • 5 then Ctrl-I to jump to line 395 in mantle/cmd/plume/prerelease.go.

I'll keep on updating the posts as I learn more about vim/neovim. Till then, saraba da!

Reference for this post

by Sayan Chowdhury at January 23, 2019 05:45 PM

January 12, 2019

Jaysinh Shukla

Python 3.7 feature walkthrough

In this post, I will explain the improvements introduced in core Python version 3.7. Below is an outline of the features covered in this post.

  • Breakpoints

  • Subprocess

  • Dataclass

  • Namedtuples

  • Hash-based Python object file


Breakpoints are an extremely important tool for debugging. Since I started learning Python, I have been using the same API for setting breakpoints. With this release, breakpoint() is introduced as a built-in function. Because it is in the built-in scope, you don’t have to import it from any module. You can call this function to set breakpoints in your code. This approach is handier than importing pdb.set_trace().

Breakpoint function in Python 3.7

Code used in above example

for i in range(100):
    if i == 10:
        breakpoint()

There wasn’t any handy option to disable or enable existing breakpoints with a single flag. With this release, you can certainly reduce your pain by using the PYTHONBREAKPOINT environment variable. You can disable all breakpoints in your code by setting the environment variable PYTHONBREAKPOINT to 0.

Breakpoint environment variable in Python 3.7

I advise setting PYTHONBREAKPOINT=0 in your production environment to avoid unwanted pauses at forgotten breakpoints.
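As a quick, runnable sketch (assuming Python 3.7+): the default breakpoint hook consults PYTHONBREAKPOINT on every call, so with the variable set to 0, every breakpoint() becomes a no-op:

```python
import os

# PYTHONBREAKPOINT is read on each breakpoint() call; "0" disables them all.
os.environ["PYTHONBREAKPOINT"] = "0"

for i in range(3):
    breakpoint()  # a no-op now; the debugger never starts

print("loop finished without dropping into pdb")
```

Unsetting the variable (or setting it to a dotted path of another callable) restores or redirects the hook without touching the code.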

You can capture both the standard output stream (stdout) and the standard error stream (stderr) by enabling the capture_output parameter of, which was added in this release.

You should note that this is an improvement over piping the streams manually. For example,["ls", "-l", "/var"], stdout=subprocess.PIPE, stderr=subprocess.PIPE) was the previous approach to capture the output of stdout and stderr.


The new class-level decorator @dataclass is introduced with the dataclasses module. Python is well known for achieving more by writing less. It seems this module will receive more updates in the future, which can be applied to remove a significant number of lines of code. A basic understanding of type hints is expected to understand this feature.

When you wrap your class with the @dataclass decorator, the decorator will write the obvious constructor code for you. Additionally, it defines behaviour for the dunder methods __repr__(), __eq__() and __hash__().


Below is the code before introducing the dataclasses.dataclass decorator.

class Point:

    def __init__(self, x, y):
        self.x = x
        self.y = y

After wrapping with the @dataclass decorator, it reduces to the code below:

from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float
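A quick demonstration of what the decorator generates, using the same Point class (a sketch, assuming Python 3.7+):

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

p = Point(1.0, 2.0)           # generated __init__
print(p)                      # Point(x=1.0, y=2.0) -- generated __repr__
print(p == Point(1.0, 2.0))   # True -- generated field-by-field __eq__
```

The handwritten class above gets none of this for free: its instances compare by identity and its repr is the unhelpful default.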


Namedtuples are a very helpful data structure, yet I find them less known amongst developers. With this release, you can set default values for the fields.

Namedtuples with default arguments

Note: defaults are applied to the rightmost fields. In the above example, the default value 2 will be assigned to the field y.

Below is the code used in the example

from collections import namedtuple

Point = namedtuple("Point", ["x", "y"], defaults=[2,])
p = Point(1)
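To make the rightmost pairing explicit, here is a sketch with a hypothetical three-field Point3 and two defaults:

```python
from collections import namedtuple

# the defaults pair with the rightmost fields: 10 -> y, 20 -> z
Point3 = namedtuple("Point3", ["x", "y", "z"], defaults=[10, 20])

print(Point3(1))        # Point3(x=1, y=10, z=20)
print(Point3(1, 2))     # Point3(x=1, y=2, z=20)
print(Point3(1, 2, 3))  # Point3(x=1, y=2, z=3)
```

Only x is mandatory here; positional arguments fill fields from the left, and any fields left over pick up their defaults.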


.pyc files are object files generated every time you change your source file (.py). They are a collection of metadata and bytecode created by the interpreter for executed code, which the interpreter reuses when you re-execute that code the next time. The previous approach to identifying an outdated object file was to compare metadata of the source file, such as the last-modified timestamp. With this release, that identification process can instead compare files using a hash-based approach, which is quick and more consistent across platforms than comparing last-modified dates. For now this improvement is opt-in: core Python will continue with the metadata approach by default and slowly migrate to the hash-based approach.


  • Calling breakpoint() will put a breakpoint in your code.

  • Disable all breakpoints in your code by setting an environment variable PYTHONBREAKPOINT=0.

  •[...], capture_output=True) will capture the output of stdout and stderr.

  • Class-level decorator @dataclass will define default logic for the constructor function. It will implement default logic for the dunder methods __repr__(), __eq__() and __hash__().

  • Namedtuple data structure supports default values to its arguments using defaults.

  • Outdated Python object files (.pyc) are compared using the hash-based approach.

I hope you were able to learn something new by reading this post. If you want to read an in-depth discussion on each feature introduced in Python 3.7, then please read this official post. Happy hacking!

Proofreaders: Jason Braganza, Ninpo, basen_ from #python at Freenode, Ultron from #python-offtopic at Freenode, up|ime from ##English at Freenode

by Jaysinh Shukla at January 12, 2019 07:53 PM

December 26, 2018

Shakthi Kannan

Ansible deployment of Jenkins

[Published in Open Source For You (OSFY) magazine, August 2017 edition.]


In this sixth article in the DevOps series, we will install Jenkins using Ansible and set up a Continuous Integration (CI) build for a project that uses Git. Jenkins is Free and Open Source automation server software that is used to build, deploy and automate projects. It is written in Java and released under the MIT license. A number of plugins are available to integrate Jenkins with other tools such as version control systems, APIs and databases.

Setting it up

A CentOS 6.8 Virtual Machine (VM) running on KVM will be used for the installation. Internet access should be available from the guest machine. The host runs Parabola GNU/Linux-libre x86_64 with Ansible installed. The ansible/ folder contains the following files:


The IP address of the guest CentOS 6.8 VM is added to the inventory file as shown below:

jenkins ansible_host= ansible_connection=ssh ansible_user=root ansible_password=password

An entry for the jenkins host is also added to the /etc/hosts file as indicated below: jenkins


The playbook to install the Jenkins server on the CentOS VM is given below:

- name: Install Jenkins software
  hosts: jenkins
  gather_facts: true
  become: yes
  become_method: sudo
  tags: [jenkins]

    - name: Update the software package repository
        name: '*'
        update_cache: yes

    - name: Install dependencies
        name: "{{ item }}"
        state: latest
        - java-1.8.0-openjdk
        - git
        - texlive-latex
        - wget

    - name: Download jenkins repo
      command: wget -O /etc/yum.repos.d/jenkins.repo

    - name: Import Jenkins CI key
        state: present

    - name: Install Jenkins
        name: "{{ item }}"
        state: latest
        - jenkins

    - name: Allow port 8080
      shell: iptables -I INPUT -p tcp --dport 8080 -m state --state NEW,ESTABLISHED -j ACCEPT

    - name: Start the server
        name: jenkins
        state: started

    - wait_for:
        port: 8080

The playbook first updates the Yum repository and installs the Java OpenJDK software dependency required for Jenkins. The Git and Tex Live LaTeX packages are required to build our project. We then download the Jenkins repository file, and import the repository GPG key. The Jenkins server is then installed, port 8080 is allowed through the firewall, and the script waits for the server to listen on port 8080. The above playbook can be invoked using the following command:

$ ansible-playbook -i inventory/kvm/inventory playbooks/configuration/jenkins.yml -vv


You can now open http://jenkins:8080 in the browser on the host to start configuring Jenkins. The web page will prompt you to enter the initial Administrator password from /var/lib/jenkins/secrets/initialAdminPassword to proceed further. This is shown in Figure 1:

Unlock Jenkins

The second step is to install plugins. For this demonstration, you can select the “Install suggested plugins” option, and later install any of the plugins that you require. Figure 2 displays the selected option:

Customize Jenkins

After you select the “Install suggested plugins” option, the plugins will get installed as shown in Figure 3:

Getting Started

An admin user is required for managing Jenkins. After installing the plugins, a form is shown for you to enter the user name, password, name and e-mail address of the administrator. A screenshot of this is shown in Figure 4:

Create First Admin User

Once the administrator credentials are stored, a “Jenkins is ready!” page will be displayed, as depicted in Figure 5:

Jenkins is ready!

You can now click on the “Start using Jenkins” button to open the default Jenkins dashboard shown in Figure 6:

Jenkins Dashboard

An example of a new project

Let’s now create a new build for the project. Provide a name in the “Enter an item name” text box and select “Freestyle project”. Figure 7 shows the screenshot for creating a new project:

Enter an item name

The next step is to add the GitHub repo to the “Repositories” section. The GitHub https URL is provided as we are not going to use any credentials in this example. By default, the master branch will be built. The form to enter the GitHub URL is shown in Figure 8:

Add GitHub repo

A Makefile is available in the project source code, and hence we can simply invoke “make” to build the project. The “Execute shell” option is chosen in the “Build” step, and the “make clean; make” command is added to the build step as shown in Figure 9.

Build step

From the left panel, you can click on the “Build Now” link for the project to trigger a build. After a successful build, you should see a screenshot similar to Figure 10.

Build success


An uninstall script to remove the Jenkins server is available in playbooks/admin folder. It is given below for reference:

- name: Uninstall Jenkins
  hosts: jenkins
  gather_facts: true
  become: yes
  become_method: sudo
  tags: [remove]

    - name: Stop Jenkins server
        name: jenkins
        state: stopped

    - name: Uninstall packages
        name: "{{ item }}"
        state: absent
        - jenkins

The script can be invoked as follows:

$ ansible-playbook -i inventory/kvm/inventory playbooks/admin/uninstall-jenkins.yml

December 26, 2018 01:00 PM

November 27, 2018

Anwesha Das

Upgraded my blog to Ghost 2.6

I have been maintaining my blog. It is a self-hosted Ghost blog, with Casper, the Ghost default, as my theme. Recently, in September 2018, Ghost updated its version to 2.0. Now it is time to update mine.

It is always advisable to test before deploying to the production server. I maintain a stage instance for that purpose. I test any and all changes there before touching the production server. I did the same thing here as well.

I exported the Ghost data into a JSON file, and prettified the file for ease of reading. I removed the old database and started the container for the new Ghost. Then I re-imported the data into the new Ghost using the JSON file.

I had another problem to solve: the theme. I used to have Casper as my theme, but I do not like its new look for my blog, which is predominantly a text blog. I was unable to keep the same theme on the new Ghost, so I chose Attila as my theme instead. I made some modifications, uploaded it, and enabled it for my blog. Huge gratitude to the Ghost community and developers; it was a really smooth job.

by Anwesha Das at November 27, 2018 02:57 PM

November 10, 2018

Robin Schubert

Truth vs. Theory

The dumbest thing you can do is to think you're smart.

We often tend to think we know a lot of things. Things we read, hear or see from whatever source of information may be perceived as simply true. However, I think it is very important to question even the most trivial of well-known things. The belief in knowledge does not just kill creativity but can also be dangerous.

I've studied Physics, and there is one take-home message that I would like to share. People often hear that physicists have discovered this or that. In most cases this leads to the belief that we know how the world around us works and what it is made of. Actually, we don't. The way Physics works is different: it won't tell you how things work or what they are made of; instead it will provide you with a set of tools, models, and theories, derived from observations and previous models and theories, that will often result in pretty good approximations and predictions of what we're observing in the world around us. This is no better or worse than the truth would be; in fact it's a very pure and straightforward approach that allows us to go far beyond what sometimes seems possible.

It would in fact be quite optimistic to think that we could understand truth, with the limitations of our nature. We perceive the world in three dimensions, are heavily dependent on language (I could write a whole book on that) and have a limited set of senses - but what is worse: we're not even using them. We rely on science and studies instead, losing more and more the ability to perceive and interpret (and believe in) the signals of our own body. You cannot convince someone who might call himself a scientist, who knows how things work, of the efficacy of some compound just because you feel that it is good and right for you. Instead, the compound has to go through several stages of clinical trials that try to measure safety, tolerability and efficacy in vitro, in animals, and in humans. While I understand and appreciate this approach, I often feel that the available tools to assess these domains are not even close to being suitable for the task. As a result, a negative trial will let us conclude that there is no effect.

It's neither easy nor fun to discuss with someone who is fiercely convinced by something they just read in an article. While it's a very good thing to read (or to gather information through other channels), that information should not simply be taken for granted because it has been printed in a journal. Questioning that information at least every once in a while should be a habit.

by Robin Schubert at November 10, 2018 12:00 AM

October 29, 2018

Anu Kumari Gupta (ann)

Enjoy octobers with Hacktoberfest

I know what you are going to do this October. Scratching your head already? No, don’t, because I will explain in detail all that you can do to make this October a remarkable one, by participating in Hacktoberfest.

Guessing what is the buzz of Hacktoberfest all around? 🤔

Hacktoberfest is like a festival celebrated by people of the open source community that runs throughout the month. It is a celebration of open source software, and it welcomes everyone, irrespective of the knowledge they have of open source, to participate and make their contribution.

  • Hacktoberfest is open to everyone in our global community!
  • Five quality pull requests must be submitted to public GitHub repositories.
  • You can sign up anytime between October 1 and October 31.

<<<<Oh NO! STOP! Hacktoberfest site defines it all. Enough! Get me to the point.>>>>

Already had enough of the rules and regulations, and still wondering what it is all about, why to do it, and how to get started? Welcome to the right place. This Hacktoberfest centres a lot around open source. What is it? Get your answer.

What is open source?

If you are stuck on the name "open source" itself, don’t worry; it’s nothing other than what the phrase ‘open source’ means. Open source refers to the availability of the source code of a project, work, software, etc. to everyone, so that others can see it, make changes to it that can be beneficial to the project, share it, and download it for use. The main aim of doing so is to maintain transparency and collaborative participation, to support the overall development and maintenance of the work, and it is highly valued for its redistributive nature. With open source, you can organize events, schedule your plans, and host them on an open source platform as well. The changes that you make to others' work are termed contributions. Contributions do not necessarily have to be core code. They can be anything you like: design, organizing, documentation, projects of your liking, etc.

Why should I participate?

The reason you should is that you get to learn, grow, and eventually develop skills. When you make your work public, it becomes helpful to you because others analyse your work and give you valuable feedback through comments and issues. The kind of work you do makes you recognized among others. By participating actively in contribution, you also find mentors who can guide you through the project, which helps you in the long run.

And did I tell you, you get T-shirts for contributing? Hacktoberfest lets you win a T-shirt by making at least 5 contributions. Maybe this is motivating enough to start, right? 😛 Time to enter the Open Source World.

How to enter into the open source world?

All you need is “Git” and an understanding of how to use it. If you are a beginner and don’t know how to start, or have difficulty starting off, refer to “Hello Git” before moving further. The article gives a basic understanding of Git and shows how to push your code through Git to make it available to everyone. Understanding is essential, so take your time going through it and understanding the concepts. If you are good to go, you are now ready to contribute to others' work.

Steps to contribute:

Step 1: You should have a GitHub account.

Refer to the post “Hello Git”, if you have not already. The idea there is a basic understanding of the Git workflow and creating your first repository (your own piece of work).

Step 2: Choose a project.

I know choosing a project is a bit confusing. It seems overwhelming at first, but trust me, once you get the insights of working, you will feel proud of yourself. If you are a beginner, I would recommend you first understand the process by making small changes, like correcting mistakes in a README file or adding your name to the contributors list. As I already mentioned, not every contribution is code. Select whatever you like and feel you can change to improve the current piece of work.

There are numerous beginner-friendly as well as cool projects that you will see labelled as hacktoberfest. Pick one of your choice. Once you are done selecting a project, get into the project and follow the rest.

Step 3: Fork the project.

You will come across several similar posts giving you instructions on what to do to reach the objective, but what is most important is that you understand what you are doing and why. Here I am, to explain why exactly you need to perform these commands and what these terms mean.

To fork means to create a copy of someone else’s repository and add it to your own GitHub account. By forking, you are making a copy of the project for yourself, to make changes to it. The reason we do this is that you might not want to make changes to the main repository directly. The changes you make stay with you until you finalize them, commit, and let the owner of the project know about them.

You must be able to see the fork option somewhere at the top right.


Do you see the number beside it? That is the number of forks made of this repository. Click on the fork option and you will see it forking as:

(Screenshot: the repository being forked)

Notice the change in the URL. You will see that it has been added to your account. Now you have a copy of the project.

Step 4: Clone the repository

What is cloning? It is actually downloading the repository to make it available on your desktop, so you can make changes to it. Now that you have the project at hand, you are ready to make the changes you feel are necessary, with the help of the tools and applications on your desktop.

The green “clone or download” button shows you a link, and another option to download directly.

If you have git installed on your machine, you can perform commands to clone it as:

git clone "copied url"

Here, copied url is the URL shown to you for copying.

Step 5: Create a branch.

Branching is like the several directories you have on your computer. Each branch holds a different version of the changes you make. It is essential because it lets you track the changes you make by creating branches.

To perform operations on your machine, all you need to do is change to the repository directory on your computer.

 cd <project name>

Now create a branch using the git checkout command:

git checkout -b <branch name>

The branch name is given by you. It can be any name of your choice, but make it relatable.

Step 6: Make changes and commit

If you list all the files and subdirectories with the help of the ls command, your next step is to find the file or directory in which you have to make the changes, and make them. For example, if you have to update the README file, you will need an editor to open the file and write in it. After you are done updating, you are ready for the next step.

Step 7: Push changes

Now you will want these changes to be uploaded to the place they came from, so the phrase used is that you “push changes”. This is done because, after the work, i.e. the improvements to the project, you will want to make it known to the owner or creator of the project.

so to push changes, you perform as follows:

git push origin <branch name>

You can reference the URL easily (by default it's origin). You can alternatively use any short name in place of origin, but you have to use the same one in the next step as well.

Step 8: Create a pull request

If you go to the repository on GitHub, you will see information about your updates, and beside that the “Compare and pull request” option. This is the request made to the creator of the main project to look into your changes and merge them into the main project, if that is something the owner allows and wants. The owner of the project sees the changes you made and applies the necessary patches as he/she feels right.

And you are done. Congratulations! 🎉

Not only this, you are always welcome to go through the issues list of a project and try to solve a problem: first by commenting and letting everyone know the idea you have for solving the issue, and then, once your idea is approved, by making contributions as above. You can make a pull request and reference it to the issue that you solved.

But, but, but… why don’t you create your own issues on a working project and add the Hacktoberfest label for others to solve? You will be amazed by the participation. You are the admin of your project. People will create issues and pull requests, and you have to review them and merge them into your main project. Try it out!

I hope you find this useful and enjoy doing it.

Happy Learning!

by anuGupta at October 29, 2018 08:20 PM

October 22, 2018

Sanyam Khurana

Event Report - DjangoCon US

If you've already read about my journey to PyCon AU, you're aware that I was working on a Chinese app. I got one more month to work on the Chinese app after PyCon AU, which meant improving my talk to have more things such as passing the locale info in async tasks, switching language in templates, supporting multiple languages in templates etc.

I presented the second version of the talk at DjangoCon US. The very first people I got to see again, as soon as I entered the DjangoCon US venue, were Russell and Katie from Australia. I was pretty jet-lagged, as my international flight got delayed by 10 hours, but I tried my best to deliver the talk.

Here is the recording of the talk:

You can see the slides of my talk below or by clicking here:

After the conference, we also had a DSF meet and greet, where I met Frank, Rebecca, Jeff, and a few others. Everyone was so encouraging and we had a pretty good discussion around Django communities. I also met Carlton Gibson, who recently became a DSF Fellow and also gave a really good talk at DjangoCon on Your web framework needs you!.

Carol, Jeff, and Carlton encouraged me to start contributing to Django, so I was waiting eagerly for the sprints.

DjangoCon US with Mariatta Wijaya, Carol Willing, Carlton Gibson

Unfortunately, Carlton wasn't there during the sprints, but Andrew Pinkham was kind enough to help me set up the codebase. We were unable to run the test suite successfully and tried to debug that; later we agreed to use django-box to set things up. I contributed a few PRs to Django and was also able to address reviews on my CPython patches. During the sprints, I also had a discussion with Rebecca, and we listed down some points on how we can lower the barrier for new contributions in Django and bring in more contributors.

I also published a report of my two days sprinting on Twitter:

DjangoCon US contributions report by Sanyam Khurana (CuriousLearner)

I also met Andrew Godwin & James Bennett. If you haven't yet seen the Django in-depth talk by James, I highly recommend you watch it. It gave me a lot of understanding of how things happen under the hood in Django.

It was a great experience altogether being an attendee, speaker, and volunteer at DjangoCon. It was really a very rewarding journey for me.

There are tons of things we can improve in PyCon India, taking inspiration from conferences like DjangoCon US, which I hope to help implement in future editions of the conference.

Here is a group picture of everyone at DjangoCon US. Credits to Bartek for the amazing click.

DjangoCon US group picture

I want to thank all the volunteers, speakers and attendees for an awesome experience and making DjangoCon a lot of fun!

by Sanyam Khurana at October 22, 2018 06:57 AM