Why does what we do in life matter?

I just turned 23. The most pressing question at this point in life is what you want to do with it. Follow some passion? Make an impact on something? Earn lots of money and chill? Almost every time I sit down to answer the question, I end up wondering, 'In the end, does it even matter?'

The paradox of free will and determinism
Growing up, most of the science we study in school is deterministic. In other words, we study how the universe works and try to formulate it in mathematical equations, which we then use to predict the future. This gives rise to the philosophical idea of determinism: given complete knowledge of the state of every particle in the universe at one point in time, and of how those particles interact, one could precisely predict the state of the universe at any time in the future or the past. Time, along with free will, loses its meaning. If the future is already determined, it shouldn't matter what you do right now; or rather, what you do right now doesn't depend on what you think; and what you think right now isn't really in your control; and so on. The whole idea of individualism, of making an impact, of making huge life-changing decisions, seems moot. Is that it? Is what we do in life not really a choice, so it makes no sense to sit and think about it? Well, not so fast.

Quantum Mechanics – Hope
The well-known Schrödinger's cat comes to the rescue. For those who don't know, Schrödinger's cat is a cat in a closed box that is simultaneously dead and alive until we open the box to see its state; i.e., it is not deterministic. In other words, if you repeat the experiment multiple times with exactly the same initial apparatus, the cat is alive in some cases and dead in others. The cat represents the behaviour of fundamental particles according to quantum mechanics. For people who didn't understand a thing: quantum mechanics describes a world in which you can never predict the future perfectly. Albert Einstein famously said that "God doesn't play dice" and was seemingly unhappy with this development. I, on the other hand, think we should be grateful for it, as it gives us some hope that some combination of these fundamentally random particles gives rise to free will, which makes the decisions about what to do with our lives all the more important, and thinking about them meaningful. These videos by Veritasium and Vsauce have a better shot at explaining this if all I just said went over your head.

Human Colossus
So far we have established that there is still a possibility that we possess free will and the ability to decide what to do with our lives. But the main question remains unanswered: why does it matter? Here's a long (very loooong) blog post by Tim Urban (which I shall break down further in future posts) that introduces the idea of humans as the cells of a big Human Colossus. Once you hold that image, it becomes much clearer what one should not be doing: avoid becoming the cancer cell, as Vishen Lakhiani explains perfectly in this video; understand that whatever you do affects the whole Human Colossus; and try to do things that serve a greater purpose, the collective goal of the Human Colossus.

The collective goal
There are always plenty of existential questions, which the Colossus tries to answer by pushing the boundaries of knowledge, by advancing itself, and by ensuring its survival long enough to find all the answers, all while trying to have fun along the way.

PS: Currently we (the Colossus) need to get our shit together, because we are farting badly, hence climate change. More on that in future posts. Till then, go here. 😛

 

The coding weeks are over!

So people, the coding weeks are over. This post is a reference to the work I did during this period, highlighting the goals achieved and the outstanding work.

The task was to develop a Gigabit Ethernet Media Access Controller (GEMAC), the MAC sublayer, in accordance with the IEEE 802.3-2005 standard using MyHDL. The aim was to test and help the development of MyHDL 1.0dev, while also demonstrating its use to other developers.

In brief, the work done includes the Management block and the core blocks, i.e., the Transmit and Receive Engines with the Address Filter and Flow Control. Work left includes the interfacing blocks, i.e., the FIFOs (Rx and Tx) and the GMII.

Post mid-term, I started implementing the core blocks. Midway, I realised that I would be better off using finite state machines to implement them, which led me to rewrite both blocks. Currently I am looking at implementing the interfacing blocks: the FIFOs (for which I shall try to reuse blocks already developed by other developers) and the GMII (which depends on the PHY; I will be using the one on the ZedBoard, which means I will actually be developing an RGMII).

Tests for each block were developed using pytest. Separate tests were written for each unique feature to ensure it works. Convertibility tests were also developed to check the validity of the converted Verilog code, which shall be used for hardware testing at the end.
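For readers unfamiliar with this style of testing, here is a minimal sketch (not the project's actual code) of what a pytest convertibility test for a MyHDL 1.0 block can look like. The 'counter' block and its ports are made-up stand-ins for a GEMAC sub-block.

    from myhdl import block, Signal, ResetSignal, intbv, always_seq

    @block
    def counter(clk, reset, count):
        """Toy block standing in for a GEMAC sub-block."""
        @always_seq(clk.posedge, reset=reset)
        def logic():
            count.next = (count + 1) % 256
        return logic

    def test_counter_is_convertible():
        clk = Signal(bool(0))
        reset = ResetSignal(0, 1, False)   # active-high, synchronous reset
        count = Signal(intbv(0)[8:])
        inst = counter(clk, reset, count)
        inst.convert(hdl='Verilog')        # raises a conversion error if the block is not convertible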

Main Repo : https://github.com/ravijain056/GEMAC/

Links to PRs:
1. Implemented Modular Base: https://github.com/ravijain056/GEMAC/pull/1
2. Implemented Management Module: https://github.com/ravijain056/GEMAC/pull/4
3. Implemented Transmit Engine: https://github.com/ravijain056/GEMAC/pull/5
4. Implemented Receive Engine: https://github.com/ravijain056/GEMAC/pull/6

My main focus after I am done with this is to make the code approachable for other developers by providing various good examples of using the library.

Finite State Machines

Well, it's been a tough couple of weeks. Proceedings at my university have caused me to slow down a bit, but I have been making progress. I was working on the RxEngine block when things got too complex, and I decided to take a step back and turn to finite state machines (FSMs) to write simpler, more readable code. As it turns out, I ended up rewriting the TxEngine block from scratch as well. In the midst of all this, the file system in my local repo got too clumsy, since I had multiple versions of the RxEngine implementation, so I planned to wait for the final revision of the blocks to avoid problems with the commits later on while rebasing. I shall push the latest code in a day or two for review.
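To give a flavour of the FSM style, here is a rough, simplified sketch of a state machine in MyHDL using an enum for the states. The states and ports below are illustrative only; they are not the actual TxEngine state machine.

    from myhdl import block, Signal, ResetSignal, enum, always_seq

    # illustrative states only; the real engine has more of them
    t_state = enum('IDLE', 'PREAMBLE', 'DATA', 'FCS')

    @block
    def tx_fsm(clk, reset, start, done):
        state = Signal(t_state.IDLE)

        @always_seq(clk.posedge, reset=reset)
        def fsm():
            if state == t_state.IDLE:
                done.next = False
                if start:
                    state.next = t_state.PREAMBLE
            elif state == t_state.PREAMBLE:
                state.next = t_state.DATA
            elif state == t_state.DATA:
                state.next = t_state.FCS
            elif state == t_state.FCS:
                done.next = True
                state.next = t_state.IDLE

        return fsm

One branch per state keeps each transition readable, which is exactly what the RxEngine rewrite needed.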

While implementing the TxEngine block using FSMs, I added the underrun functionality that was missing from the previous implementation. I also did a rough implementation of the Flow Control block, which accepts requests from the client to send pause control frames and triggers the TxEngine accordingly.

I also had a discussion with Josy, one of the mentors, about how to provide clocks to the sub-blocks and how to handle the reset. He suggested providing clocks to the sub-blocks directly in the top block, as opposed to relaying them through the sub-blocks. A good reason I can think of to support this is that, if your system is big and complex, relaying clocks might cause problems in simulation. I shall discuss it in more detail in upcoming posts; a minimal sketch of the idea follows.
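A minimal sketch of what that suggestion means in practice: the top-level block hands the same clock and reset straight to each sub-block instead of routing them through one sub-block into another. The block names here are made up, not the GEMAC hierarchy.

    from myhdl import block, always_seq

    @block
    def sub_block(clk, reset, q):
        """Stand-in sub-block; it just toggles its output."""
        @always_seq(clk.posedge, reset=reset)
        def logic():
            q.next = not q
        return logic

    @block
    def top_block(clk, reset, q0, q1):
        """The top block distributes clk and reset directly to every sub-block."""
        inst0 = sub_block(clk, reset, q0)
        inst1 = sub_block(clk, reset, q1)
        return inst0, inst1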

Started Receive Engine!

It's been a long time since my last post (two weeks, phew)! Sorry for the slump. Anyway, during this period I successfully merged the Transmit Engine after my mentor's review. I later realised that I had missed adding the client-underrun functionality, which is used to corrupt the current frame transmission. I shall make sure to add that in the next merge.

Next I started looking at the GMII, which partly stalled my work because I was unable to clearly understand what I had to do there. So I decided to move on and complete the Receive Engine with the Address Filter first. So far I have finished receiving the destination address from the data stream and filtering by matching it against the address table. If there is a match, the receiver starts forwarding the stream to the client side; otherwise it just ignores the frame.
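As a software-only illustration of that filtering step (the real block works on a streaming hardware interface, not on whole frames), something along these lines; the function and the address table below are hypothetical.

    def address_filter(frame, address_table):
        """Forward the frame only if its destination MAC is in the address table."""
        dest_mac = bytes(frame[:6])        # destination address: first 6 bytes of the frame
        if dest_mac in address_table:
            return frame                   # match: forward the stream to the client side
        return None                        # no match: ignore the frame

    # Example usage with a single-entry address table
    table = {bytes.fromhex('0a1b2c3d4e5f')}
    frame = bytes.fromhex('0a1b2c3d4e5f665544332211') + b'\x08\x00' + b'payload'
    assert address_filter(frame, table) is not None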

Next, I look forward to adding error-check functionality so that the receiver can assert Good/Bad Frame at the end of a transmission.

CRC32: Transmit Engine

Completed the first draft of the Transmit Engine implementation. The implementation was fairly straightforward, barring the calculation of the CRC32 (cyclic redundancy check) for the frame check sequence.

It stalled me for a day or two and required patience while reading about and understanding the different kinds of implementation. A very painless tutorial for understanding CRC32 and implementing it from the ground up can be found here. This implementation in C also helped.
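For reference, here is the straightforward bit-by-bit version in Python (the reflected algorithm with the Ethernet polynomial 0xEDB88320). It is a software model to check results against, not the hardware implementation used in the Transmit Engine.

    import zlib

    def crc32(data):
        """Bit-by-bit reflected CRC-32 as used for the Ethernet frame check sequence."""
        crc = 0xFFFFFFFF
        for byte in data:
            crc ^= byte
            for _ in range(8):
                if crc & 1:
                    crc = (crc >> 1) ^ 0xEDB88320   # reflected Ethernet polynomial
                else:
                    crc >>= 1
        return crc ^ 0xFFFFFFFF                     # final inversion

    # Sanity check against zlib's implementation and the standard check value
    assert crc32(b'123456789') == zlib.crc32(b'123456789') == 0xCBF43926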

I have now opened a pull request for code review.

Started Transmit Engine!

Yay readers, good news: I got through the mid-term evaluation and received the payment. Feels good!

About the project: I started off with the implementation of the Transmit Engine. Sweeping changes had to be made to the interfaces of the sub-blocks. Notable changes:

  • Removal of the Client sub-block: the FIFOs now interface directly with the engine.
  • Addition of intrafaces, i.e., interfaces between the sub-blocks.
  • Moving the configregs (configuration registers) and addrtable (address table) out of the Management block into the main gemac block, widening their scope to the other sub-blocks. The Management block now accesses them through ports.

As a result of changing the ports of the Management block, I had to edit test_management to reflect the change. I had an independent instantiation of the Management block in every test, which was redundant. I then looked into pytest fixtures, which let me have a common function that runs before every test, removing the redundancy. It also makes it convenient to change the block's port definitions in the future if needed.
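Here is a tiny sketch of the fixture pattern I mean; the port names are placeholders and not the Management block's real interface.

    import pytest
    from myhdl import Signal, intbv

    @pytest.fixture
    def ports():
        """Build the common port set once per test instead of repeating it in every test."""
        return {
            'clk': Signal(bool(0)),
            'reset': Signal(bool(1)),
            'data': Signal(intbv(0)[32:]),
        }

    def test_reset_is_asserted(ports):
        assert ports['reset'] == 1

    def test_data_is_32_bits_wide(ports):
        assert len(ports['data']) == 32

Each test simply takes the fixture as an argument, and pytest builds a fresh port set before running it.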

Now I am working on implementing its features. A little about the Transmit Engine:

“Accepts Ethernet frame data from the Client Transmitter interface,
adds preamble to the start of the frame, add padding bytes and frame
check sequence. It ensures that the inter-frame spacing between successive
frames is at least the minimum specified. The frame is then converted
into a format that is compatible with the GMII and sent to the GMII Block.”
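As a rough software sketch of those steps (preamble, padding and FCS; the inter-frame gap is a timing concern and is left out), using the usual 802.3 constants. This is only an illustration, not the engine's code; the names are my own.

    import zlib

    PREAMBLE_SFD = bytes([0x55] * 7 + [0xD5])   # 7 preamble bytes + start-of-frame delimiter
    MIN_FRAME = 60                              # minimum frame length before the 4-byte FCS

    def assemble_frame(mac_frame):
        """Pad a short frame to the minimum size and append the frame check sequence."""
        padded = mac_frame.ljust(MIN_FRAME, b'\x00')
        fcs = zlib.crc32(padded).to_bytes(4, 'little')   # FCS bytes as they appear in the frame
        return PREAMBLE_SFD + padded + fcs

    # Example: a tiny payload gets padded up to 60 bytes before the FCS is added
    frame = assemble_frame(bytes(14) + b'hello')
    assert len(frame) == 8 + 60 + 4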

GSoC: Mid-Term Summary

Well, four weeks of GSoC are over, and it's time for the mid-term summary and re-planning.

Mid-Term Summary:

  • Studied MACs and how they work. Chose Xilinx User Guide 144 (1-Gigabit Ethernet MAC) as the interface guide and its reference Verilog design as the features guide.
  • Completed the setup of the main repo, providing the modular base for further development.
  • Implemented the Management sub-block.
  • Set up the repo with the Travis CI build, Landscape linting and Coveralls.

So, comparing with the timeline in the proposal, I have achieved the targets of the first four weeks, with the Management and TxEngine modules swapped in order.

Further Plans:

  • Take the three other sub-blocks mentioned in the proposal timeline and try to implement them in a week each (rather than two weeks each as proposed).
  • Implement the wrapper blocks, including the FIFOs and the client interface, in the next week.
  • In the remaining weeks: hardware testing, refactoring the code if necessary, and setting up Read the Docs.

Maintain A Clean History!

Finally, I have completed and merged the Management module. Since my last post, the things I needed to do before merging were: add the docstrings, set up Coveralls, and resolve conflicts with the master branch (rebase).

Adding docstrings was the easiest part, but it still took time as it gets a little boring (duh!). I used this example provided by my mentor as a reference.

Then came the Coveralls setup, which I must say is a little more complex than the others. I got a lot of help from referencing an already set-up repo, test_jpeg, which a fellow GSoCer is currently working on. It got a little tricky in between, as I stumbled upon an end-of-line character problem. Before this I didn't even know that the "type of enter" could cause problems when running scripts. It consumed one whole day; the problem had crept in while I was editing the file in Notepad on Windows. This post later helped me get over it. More on the Coveralls setup in my next post!

Next came rebasing my dev branch and resolving its conflicts with the master branch. When I started, my master branch was a few commits ahead (setting up the badges) and thus there were conflicts. The rebase was also required because my mentors suggested maintaining a clean history in the main branch. It took me a lot of experiments to finally understand how to go about rebasing my branches. The structure of my repo:

  • origin/master
  • origin/dev
  • dev

So I have a local dev branch in which I develop my code and constantly push to the remote origin/dev branch for code reviews by my mentor. This leads to lots of commits containing lots of small changes and fixes for silly issues. But when I make a pull request and merge onto the origin/master branch, I want a cleaner commit history.

Doing an interactive rebase helps to modify that history using pick (keep the commit), squash (meld into the previous commit while editing the commit message) and fixup (meld into the previous commit, keeping its message intact). Understanding this required a lot of experiments with my branch, which is dangerous, so I had made a copy of my dev branch first, which I suggest you do right now before continuing.

To rebase your local branch onto the origin/master branch, use "git rebase -i <base branch>". Warning: avoid moving commits up or down if they touch the same file, as this may cause conflicts. Once the rebase starts, resolving conflicts is a lot of pain, because a conflict that is not resolved properly triggers further conflicts down the line.

After rebasing comes the trickier part: your local branch has brand-new rebased commits while your remote still has the old ones. You need to use "git push --force", which overwrites the commits on the remote branch, after which you can open a pull request onto origin/master. Don't do this if other branches are based on this branch; in that case, merge directly onto master, the downside being that you won't be able to make a pull request, which is essential for code discussions.

After all this my code was ready to merge, and I got the go-ahead from my mentor (after a day of internet outage, hate that) to merge it. So I had finally completed the second merge onto my main branch, implementing the Management block and setting up Coveralls.

 

GSoC Update: Stuck with conversion

In the past week I implemented a new feature for the Management block, address table read/write, which shall be used for address-filtering purposes, and I updated the test suite accordingly without much trouble.

Then I started looking into cosimulating the Management sub-block that had been implemented. It took me a while to understand the concept. After talking with my mentors, I chose to leave cosimulation for verifying the converted code of the top-level constructs, and to use simple convertible testbenches to verify the generated Verilog/VHDL code of the sub-blocks.

While pursuing that, I faced a lot of issues and uncovered some problems with the conversion itself, which I have posted about in detail on Discourse (Verilog, VHDL).

So next, I should develop tests for the MyHDL core that cover some of the issues mentioned on Discourse and make a pull request. After that I should finish this module within the week, completing it with good documentation as well.

GSoC: Management Block – Half Work done!

It's been seven days since my last blog update. Time flies! One thing I wish I hadn't done these past days is procrastinate on blogging (my org requires me to blog 3-4 times a week). Other than that, this week has been the best one yet. It certainly gets much easier once you start off.

In my last blog post I said I was still figuring out the proper way to use the FIFOs with the GEMAC core and their interfaces. While going through the Xilinx User Guide 144 document, I found that they had documented the interfaces of the FIFO too, so that problem was solved easily. I also mentioned that I would be developing some tests for the GEMAC core as a whole. Well, I couldn't do that, because I had never written tests before and I didn't have the experience to identify the transactors (the structure) of the package before actually implementing it.

So I decided to move on, pick a block, and implement it. (I use the term 'block', as MyHDL does, because 'module' clashes with a different concept in Python; I will use 'block' for the rest of this blog series to refer to a hardware module.) I chose the Management block to start with, as opposed to the Transmit Engine mentioned in my project proposal, because I thought it would be better to first know the configuration registers on which the other blocks base their behaviour.

To start with, again, I couldn't begin with the tests (despite my org suggesting a test-driven approach), for the same reason as above. So instead I started off by implementing the features of the block directly. The first feature I added was reading/writing the configuration registers (basically a list of signals) present in the block. Then I quickly moved on to another feature: converting host transactions into MDIO MII transactions, which is, roughly speaking, converting parallel data into serial data (write operation) and vice versa (read operation).
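To make the parallel-to-serial part concrete, here is a heavily simplified sketch of a parallel-in, serial-out shift register in MyHDL. The real MDIO write frame also carries a preamble, start/opcode bits and PHY/register addresses; the block and port names below are illustrative only.

    from myhdl import block, Signal, intbv, always_seq

    @block
    def piso(mdc, reset, load, data_in, mdio_out):
        """Parallel-in, serial-out: shift a 16-bit word out MSB first on the serial line."""
        shreg = Signal(intbv(0)[16:])

        @always_seq(mdc.posedge, reset=reset)
        def shift():
            if load:
                shreg.next = data_in                 # latch the parallel word from the host
            else:
                mdio_out.next = shreg[15]            # present the MSB on the serial line
                shreg.next = (shreg << 1) & 0xFFFF   # shift left, dropping the bit just sent

        return shift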

Once I had completed the first version, it was pretty clumsy and nowhere close to what the people at the org desired. In my defence, it was my first time developing a real block, other than writing some silly examples here and there. At this point I made a pull request to the main repository, knowing that the block wasn't complete yet, because I desperately needed reviews from my mentors. Beware: up to this point I had done no simulation or testing to check whether my code was correct.

Well, the reviews came in, and I started adding tests that simulated the features one by one, using the new API presented in MEP-114. One small change from the API mentioned there concerns 'inst.config_sim(trace=True)': note that there is no 'backend' parameter for config_sim, unlike what the MEP (MyHDL Enhancement Proposal) says. I used the Impulse plugin in Eclipse to view the traced signals produced by the simulation.
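For completeness, this is the shape of the simulation API I am talking about, shown on a toy block rather than the Management block; note that config_sim takes trace=True and no backend argument.

    from myhdl import block, Signal, delay, always, instance, StopSimulation

    @block
    def toggler(led, clk):
        @always(clk.posedge)
        def logic():
            led.next = not led
        return logic

    @block
    def tb():
        led = Signal(bool(0))
        clk = Signal(bool(0))
        dut = toggler(led, clk)

        @always(delay(5))
        def clkgen():
            clk.next = not clk

        @instance
        def stop():
            yield delay(200)
            raise StopSimulation()

        return dut, clkgen, stop

    inst = tb()
    inst.config_sim(trace=True)   # dumps a .vcd file that a waveform viewer (e.g. Impulse) can open
    inst.run_sim()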

I added tests one by one and followed something like a test-first approach after that, i.e., adding a test and tweaking the code until the test passes. The tests added for MyHDL simulation so far are: read/write of the configuration registers, MDIO clock generation (generated from the host clock), and the MDIO write operation. This led to what I would call the second version of the implementation, which was still far from the current one: quite complicated, but still passing the tests. I learned about TristateSignal and decided to leave its use until the top-level block.

Next came the test that checks the convertibility of the code to other HDLs, which taught me quite a few things and helped me relate the code I write to the actual hardware implementation. It led me to optimise my code considerably and trim down unnecessary stuff, producing the current version of the code, which is convertible. Things I learnt while making the code convertible (see the sketch after this list):

  • You cannot use yield statements in the processes.
  • You can have a class with Signals as attributes for local signals.
  • You cannot use class methods in the procedural code.
  • In local functions, all the Signals used should be passed as arguments, regardless of scope. The classes mentioned above cannot be passed as arguments.
  • A Signal can be driven from only one process, i.e., you cannot perform 'sig.next = <some value>' in two different processes.
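A tiny illustration of the last point: in the hypothetical block below, 'sig' is assigned from two different processes, which the converter rejects as a signal with multiple drivers. Merging both assignments into a single process is the fix.

    from myhdl import block, always_seq

    @block
    def bad_block(clk, reset, a, b, sig):
        @always_seq(clk.posedge, reset=reset)
        def proc_a():
            if a:
                sig.next = 1

        @always_seq(clk.posedge, reset=reset)
        def proc_b():
            if b:
                sig.next = 0     # second driver of 'sig': conversion fails here

        return proc_a, proc_b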

Now I shall refine and add some more tests over the next two days, and after that work on cosimulation of the block.