Archive for Review

Mini Book Review: Artificial Intelligence for Humans, Volume 3: Deep Learning and Neural Networks

The book

Artificial Intelligence for Humans, Volume 3: Deep Learning and Neural Networks. I read this one a good while back but have been meaning to revisit it to see if there’s anything I missed.

[Book cover]

Contents

Jeff Heaton’s book provides an overview of the major neural network models in common use, as well as a summary of deep learning. It also gives a simple explanation of training algorithms and where each is appropriate. As a relative beginner to the topic, I felt like I was the exact target audience.

Where it goes right

Although 345 pages can feel long if the topic is dry, this one is definitely an easy read. Unlike other material on the subject, the author keeps the maths very light. If you’re looking to be able to hold a conversation about what a neural net does and why you might use one, then this is the book for you. Although I had been studying neural nets and machine learning before picking it up, certain concepts were explained very well, and hearing them in plain English helped reinforce my understanding.
Another way I benefited was from Heaton’s discussion of trends. As a hobbyist, it’s great to see why certain ideas or algorithms have fallen out of favour with the people who use this stuff day in, day out.

Where it goes wrong

It’s all a little too simplistic. I came away understanding everything I had been told, but was left wanting. A few more chapters that took the detail up a notch would have been great. Anyone who already has a good fundamental understanding of neural nets (i.e. has implemented a few) and machine learning is not going to get much out of this volume. More concrete examples would also have been a bonus. Although it’s in the title, I felt that deep learning could have been covered in more depth, though I’d let that slide given how quickly the field is evolving.

Score

A solid 4.0/5. Like I said, I’ve been meaning to re-read it, which is a good sign. The “ELI5” approach helped certain concepts that I had already come across sink in, but it wasn’t revolutionary.

Code Review Tips – make the dreaded less dreadful

I’m a massive fan of peer review. When it’s done right, i.e. with everybody spurred on by a willingness to improve, code reviews are a very powerful tool. I’ve laid out a few points I like to keep in mind. Most of it is common sense, but there’s no harm in a reminder every once in a while:

Size matters

It’s not the size of the review that counts, it’s the way – whoops, wrong blog. Keep the size of the review to a few hundred lines maximum. If it’s too big, split it up into multiple reviews, preferably divided logically, e.g. review 1 covers the business-layer logic and review 2 the persistence logic. This also means one bit can be under review while you’re finishing the next bit of work.

Die young

Related to the last point: ideally, you don’t want everyone reviewing code in the last two days of the sprint and then fixing a whole load of issues on the last day. Just like with your code, you want to fail early so you can recover. If your user story is huge (ignoring the potential smell) and is going to take the remainder of the sprint to finish, try to get a minimum viable product together before then, get it reviewed, and then use what you’ve learned to improve the next iterations. Light reviews, early and often.

Diversity

Get a mix of reviewers and expertise. Obviously it’s great to get the resident SQL expert in when your review has a load of SQL queries, so that potential issues are spotted, but code reviews are just as much (or more, in my opinion) about people learning what’s going on and how to improve their own approach. Get the new guy/girl involved; their fresh perspective might catch issues that others would miss, and it’s also a valuable learning experience for them.

LIMIT 3 OFFSET 0

Diversity doesn’t mean everybody on every review. Two or three people should be fine; any more than that and it either gets hectic or some people won’t participate, and quality suffers.

“True wisdom is knowing what you don’t know” – Confucius

Leave your ego at the door. Everyone writes dodgy code, and it happens for a multitude of reasons. The purpose of the review isn’t to have a go at you for not writing the optimal solution or to find gaps in your knowledge; it’s to provide opportunities to learn from each other and, hopefully, to improve the code base for everyone.

Stand your ground

Almost contrary to the last point: don’t assume the reviewer is correct. If someone comes along and tells you to use a hashmap on line 20, don’t just say “yes, sir, would you like a cup of tea with that?” and change it without asking why. A reviewer can’t always know the full context for why things were done, and that custom data structure you wrote might have a benefit the other person missed.

Questions > direct criticism

Related to the last point (again!), make use of the Socratic method when reviewing another dev’s code. Don’t tell them to change things; instead, ask why a certain thing was done, so that an understanding of the best solution emerges from both parties instead of one person bowing to the other’s demands.

Resistance is futile

Don’t get bogged down in personal preference. If the developer likes to name variables a certain way because that’s their style and it still falls within company conventions, let them; however, if it might affect maintainability, feel free to ask whether it’s the best way to do things.

One small step

Lastly, as with all process changes, I recommend taking one small step in the right direction. Be the review advocate your team needs and lead by example. You don’t need to spend all day in code reviews; as I said, informal, light reviews are best.