On a message board, I read a thread where a poster -- a research scientist -- described how he ended up becoming the de facto IT guy in his department simply because of his superior Google skills and his willingness to Google for and apply solutions to fix issues for his colleagues.
This is something I've personally never been asked to do in an interview, nor have I thought to ask others when I interview them, but being able to quickly Google and sift through results to separate the wheat from the chaff seems to be a supremely underrated skill in today's world of software engineering.
The fact is that developers and technology specialists today need to deal with so many technologies and understand so many deep nuances that Google is often the only way any of us can get anything done, especially with the obscure errors and whatnot that Microsoft and SharePoint loooove to throw at you.
In fact, I'm quite surprised that I've never been asked to do a Google search speed and accuracy test.
How would one design such a test to effectively measure a candidate's speed and accuracy at using Google? Should the topics be relevant to the candidate's job domain, or should they be more generic? Should it test a candidate's knowledge of Google's advanced features?
One of the lessons I've been mulling about the past few weeks is the importance of scope when delivering software.
Delivery of software can be thought of as a balancing act between three buckets:
- Time - this is the schedule and how much time you have to complete the work.
- Money - this includes everything from spending on more resources, better resources, tooling support, and so on.
- Requirements - this defines what you're building.
These are the three basic buckets that constrain the scope of what can be done and they react to each other in different ways. If there are more requirements or the requirements are inflexible, then it necessitates more time or money or both. If both requirements and time are inflexible, then more money will be required to achieve the goals within those limits. If money is constrained (less resources), then you must allow more time to deliver the requirements or trim the requirements.
Having worked in both consulting and software development, I've seen that each project places different priorities on these buckets, and those priorities drive the sizing, cost, and pace of the project.
But in software development, I think one thing that many folks -- even experienced individuals -- get wrong is a fixation and inflexibility on requirements. I think that requirements are really much more fluid in software development projects than in contract-driven consulting projects. The reason is simple: in software development, the assumption is that there will always be another version and another point release; this is exactly what roadmaps are for.
Plan a solid roadmap and if you don't get a particular feature or capability in this release, you and your customers should have a good idea of which release it will be in down the road.
Some tend to lose sight of this and think that all features must be shipped in a constrained timeline. I think this is a losing proposition that forces a team to compromise on quality by artificially cramming requirements into a release. The reality is that there are truly critical features that must be delivered, and there are nice-to-have features that would be great if they could be delivered. Teams and leaders without discipline and a focus on the criteria of success will have a difficult time discerning the two and will lump nice-to-haves right alongside the critical work items. This is a recipe for failed projects, missed deadlines, and poor quality.
The reality is that software rarely comes out fully baked unless you're NASA launching a space mission worth billions of dollars and you only get one shot; most teams are not under such otherworldly constraints. There will always be something that could be better or easier to use, some missing feature discovered along the way, or some new idea. The trick for teams that succeed is being able to create boundaries on what is needed now.
Apple shipped 7 versions of iOS before adding support for third party keyboards and NFC.
NPR's first mobile site was terrible (I don't have a screenshot of it, unfortunately), but at least they shipped it and their audience could use it and now they've evolved it to be a clean, responsive web site.
Here's what an early version of Facebook looked like:
Microsoft shipped Azure without support for virtual machines until 2013.
But what if instead of new features, we're talking about bugs or design flaws? There is an important calculus at play here, and one way to think about it is this: a bug or design flaw that is preventing growth in userbase or revenue takes precedence over any other bug or design flaw in the shipped system. Think about it this way: if you've shipped four versions of the software with the flaw, chances are you can probably ship a fifth without addressing it (of course, this is not always the case, especially if the flaw is related to security). But if a bug or design flaw is stopping adoption or holding back revenue, then that flaw automatically becomes the most critical focus for a release.
The point is that in software product development, more often than not, the winning strategy isn't to get it perfect (as much as Steve Jobs would have you believe that he got it perfect each time, the fact that there was a next release meant that it was intrinsically not perfect -- there was always something to improve or add); it's to get it out and ship it and acknowledge that there will be a future release to add features or other improvements. This really allows the team to focus on what's critical now and get it out the door and on time.
To that end, a roadmap is important as a communication tool and a lever for controlling scope because it gives customers visibility and a sense of certainty that while feature X is missing in this release, it'll be in the next point release in three months. It helps manage the requirements bucket; without a roadmap, the tendency of the team and customers -- in my observation -- is to assume that every requirement is critical. It's a purely psychological notion: the lack of a roadmap makes it difficult for the team to offload some ideas and lesser requirements so it can focus on the requirements that are truly necessary to ship. Without the concrete notion of The Next Release, the feeling will be that everything must be crammed into this release.
Ultimately, I think that for software development teams to successfully ship software -- given the typical constraints of time, money, and requirements -- it's important to be able to take a critical eye to the requirements and really be able to categorize and define the scope of what is critical versus what is nice to have. A clear roadmap is an important tool to help teams organize work and thoughts as well as communicate intent to customers.
I was reading an NPR piece on worker burnout and some different tactics taken by different companies to deal with it and came across a very nice, concise definition:
Christina Maslach is a professor at the University of California, Berkeley, whose four decades of research on the subject helped popularize the term "burnout." Maslach says it's a loose term that encompasses a combination of work overload, lack of autonomy and reward and social and moral discord at work.
This sentence very concisely summarizes the key drivers of burnout; the factors at play are not as simple as "too much work".
The article also brings up an interesting observation (well, it's just the next few paragraphs):
Most burnout stems from interpersonal strife, but most employers see the solution as time off, she says.
If companies really want to know what's causing burnout in their workplace, Maslach says, they shouldn't just mandate more time off. They should assess the core problem, then design solutions to mitigate those issues.
"When it's time off, I mean, that might be time away from work," Maslach says. "Maybe you're addressing issues of exhaustion, but it's not really addressing what may be the problems at work."
Ultimately, a company, a project, a product -- it is the effort of many individual humans who must come together to fulfill a common goal. And when humans are involved, conflict is sure to arise. Obviously, you can still get things done when not all of your parts are in harmony, but isn't it much more enjoyable when they are?
I hardly consider myself an expert, but in my own experience, I've found that it's a good idea to reinforce the relationships between the people that comprise the team through team activities. A common one is eating together, or occasionally taking the whole office to lunch or dinner. It is especially important for management to be involved because it shows that employees are valued as people and not just as fungible parts of a machine.
Andrew Fitzgerald comments in that NPR article:
One day I got called into the boss's office. I was thinking to myself, "Shoot! What does this guy have on me now?" They called me in just to tell me that they thought I was doing a good job and that they appreciated my work ethic. I didn't make a lot of money. The work was kind of tedious and repetitive, but I could not tell you how good that made me feel. A little positive feedback from the higher-ups goes a long way.
At IC, the development team is in a unique position because all of us work remotely and travel to Irvine. So we end up spending quite a bit of time together eating meals, going to the shooting range, and kayaking on the weekends, and I'm planning on taking the team to an indoor climbing facility as well (I try to keep things fresh). I also try to make sure that everyone is taken care of; there is nothing I won't do, whether it's picking up lunch, driving a co-worker to the train station, or picking up fruit for everyone to share. Not just because I manage them, but because I like and respect these guys as people first and foremost.
Even at a basic level, we sit together in the office and chit-chat from time to time about random things and watch random videos after we've been hacking away for 8 or 9 hours. When we are on site, not one member of the development team leaves before the others. Not because anyone is forced to stay, and not because we have some unspoken code about such actions or that we would shame anyone that did, but I think because we all feel that we are in this together and that truly, we have a common goal to achieve as a team.
And that is an important point, in my opinion, because too often, how leaders fail is by not aligning all of the cogs of the machinery towards a common goal. Most of the time, that simply involves clear and open communication about expectations, company goals, and an understanding of the priorities of the company or the team.
Like a train with two engines pulling in opposite directions, failure by team leads to align the members of a team to a goal, or failure by management to communicate expectations and priorities, leads to inaction, indecision, and conflict as team members pull against each other. Ultimately, this just helps feed worker burnout.
In the wake of the Apple iCloud debacle, there has been a lot of discussion on what Apple has done wrong, what it could do better, and how this could have been prevented.
This is not a blog post about 2-factor authentication or proper implementation of authentication channels or how Apple should be more open in their dealings with the security community, but something more basic and common sense: give users more granular control on what gets backed up.
You will see in many discussions and comments to articles that there is quite a bit of "victim shaming".
But I think this is quite unfair: I postulate that the average smartphone user had no idea that their photos and videos were being synced to the cloud in the first place, and even if they had an abstract idea that they were (for example, you take a photo on your phone and can see it on your desktop later), they had no concrete idea of the implications (those photos are now resident in the cloud rather than transient).
It is easy to imagine that such things are obvious and should be trivially easy for end users to configure and control, but I think this is a poor assumption for anyone technically savvy to make; people like my mother and wife really have no idea about these things. My guess is that Jennifer Lawrence and Kate Upton simply had no idea that their photos and videos were sitting resident in the cloud, and even if they did, they probably couldn't figure out how to get rid of them.
Some have said that this is not the fault of the OS makers or app makers. Google Photos, for example, gives you a very clear screen when you launch it for the first time asking you if you'd like to sync the files to the cloud. But one problem is that users may not actually read these things before agreeing. The other is that even after a user agrees, if the user decides that she wishes to change her mind, the setting is turned off from a screen that is three levels deep (launch Photos, click Menu, click Settings, click Auto Backup). While this is very obvious to some, to many -- like my mother -- this is an absolute mystery. She has no idea that it's syncing her photos and has no idea how to turn it off.
I think there are many common-sense solutions -- beyond the security measures mentioned above -- that would give users more granular control over their content.
Give Periodic Notifications to Update Privacy Settings
One simple idea is that say every three months, the phone prompts you with a notification in your notification bar:
This would allow users to periodically be reminded that things like automatic sync are on and that they have the option of turning them off. The user is free to ignore it, but it would give them at least a reminder that "Hey, I'm sending your stuff to the cloud, are you OK with that? Do you want to review your settings?"
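As a rough sketch of the mechanism (the names here are my own invention; a real implementation would hook into the platform's notification APIs and persist the timestamp), the check behind such a reminder could be as simple as:

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical interval; the "every three months" from the idea above.
REMINDER_INTERVAL = timedelta(days=90)

def should_prompt_privacy_review(last_prompt: Optional[datetime],
                                 now: Optional[datetime] = None) -> bool:
    """Return True if it's time to remind the user to review privacy settings."""
    now = now or datetime.now()
    # Prompt immediately if we've never prompted; otherwise wait out the interval.
    return last_prompt is None or (now - last_prompt) >= REMINDER_INTERVAL
```

The OS would run this check on some schedule and, when it returns True, post the "review your settings" notification and record the new timestamp.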
Make Synchronization Explicit
One of the problems I have with Google Photos is that it's all or nothing by default. There isn't a middle ground that allows me to sync some of my photos as a default.
The user experience paradigm here would be much like Facebook's, where you post photos by explicitly selecting them from your album, giving you fine-grained control over what gets sent to the cloud. Likewise, iCloud and Google Photos would do well to offer a middle ground between ON and OFF that gives users more fine-grained control over what gets synced.
In discussions, some have said that this would present too high a burden on end users, but it seems to work fine for Facebook and I think that it would be relatively easy to implement in an easy to use manner:
If the user selects "Sync All", then all 20 new photos are synced to the cloud (be that iCloud, Dropbox, Google Drive, etc). If the user selects "Choose", the user is given a screen that allows the user to explicitly pick the ones to sync. The pick screen should prompt the user to "Ignore unselected items for future backup?" when selection is complete so that any unselected photos are simply ignored next time. If the user selects "Don't Sync", then do nothing.
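To make that flow concrete, here is a minimal sketch of the decision logic (the names and structure are entirely hypothetical, not any real iCloud or Google Photos API):

```python
from enum import Enum
from typing import List, Optional, Set, Tuple

class SyncChoice(Enum):
    SYNC_ALL = "sync_all"
    CHOOSE = "choose"
    DONT_SYNC = "dont_sync"

def handle_new_photos(photos: List[str],
                      choice: SyncChoice,
                      selected: Optional[Set[str]] = None,
                      ignore_unselected: bool = False,
                      ignored: Optional[Set[str]] = None) -> Tuple[List[str], Set[str]]:
    """Return (photos to upload, updated set of permanently ignored photos)."""
    ignored = set(ignored or ())
    # Photos the user previously chose to ignore never come back for sync.
    candidates = [p for p in photos if p not in ignored]
    if choice is SyncChoice.SYNC_ALL:
        return candidates, ignored
    if choice is SyncChoice.CHOOSE:
        selected = selected or set()
        to_sync = [p for p in candidates if p in selected]
        if ignore_unselected:
            # "Ignore unselected items for future backup?" -> remember them.
            ignored |= {p for p in candidates if p not in selected}
        return to_sync, ignored
    # DONT_SYNC: upload nothing, remember nothing.
    return [], ignored
```

The point of the sketch is the shape of the state, not the implementation: a persistent "ignored" set is what lets "Choose" be a one-time decision rather than a recurring chore.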
A simple design like this still gives the user access to the convenience of cloud backups while giving them explicit, fine-grained control and acknowledgement that their data will be stored in the cloud.
The victim shaming is simply not warranted; whether these individuals should or should not have taken these compromising photos and videos is not the right question to ask. The right question to ask is whether Apple or Google should be automatically syncing them to a resident cloud storage without finer grained controls and explicit consent.
The principle is that any time you -- as it were -- voluntarily let up control.
In other words, cease to cling to yourself; you have an excess of power because you are wasting energy all the time in self defense.
Trying to manage things, trying to force things to conform to your will.
The moment you stop doing that, that wasted energy is available.
Therefore you are in that sense -- having that energy available -- you are one with the divine principle; you have the energy.
When you are trying, however, to act as if you were god, that is to say you don't trust anybody and you are the dictator and you have to keep everybody in line you lose the divine energy because what you are simply doing is defending yourself.
One mistake that I've been guilty of is trying to force things to conform to my will on various projects (I still do it to varying degrees!). It is usually with the best of intentions -- for a cleaner framework, a better product, a more efficient process -- but at the same time, it is true that a lot of energy is wasted in doing so.
What is the alternative, then?
I think Watts is right that a level of trust has to exist that the team around you can help you achieve your project goals. Instead of expending the energy in controlling the members of the team, spend the energy in building that trust through training, mentorship, guidance, and giving up not just control, but responsibility.
Sometimes that trust will be unwarranted, but sometimes, that trust will pay itself back many-fold.
Another finding was that the perception of heat could affect the participants' emotions. The research team said this is because emotions are more intense under bright light, leading to the perception of light as heat, which can trigger more intense emotions.
The team also found that bright light affects the kinds of decisions people make. Since the majority of people work during the day under bright lighting conditions, the researchers noted that most daily decisions are made under bright light, which intensifies emotions.
Accordingly, they suggest that turning the light lower may help people make more rational decisions, not to mention negotiate better settlements in a calmer manner.
Taking emotion out of decision making is one of the most important skills one can develop and maybe it can be as simple as installing more ambient lighting and turning off those (ugly) fluorescent lamps overhead!
From Peopleware, 3rd Edition:
The propensity to lead without being given the authority to do so is what, in organizations, distinguishes people that can innovate and break free of the constraints that limit their competitors. Innovation is all about leadership, and leadership is all about innovation. The rarity of one is the direct result of the rarity of the other. (p. 101)
My sister-in-law pinged me for some tips to prepare for a long, multi-session interview coming up.
I've been on both ends as interviewer and interviewee (mostly interviewer) and I seem to have been pretty successful as far as interviews go, so here are my tips (your mileage may vary):
- There is no preparation. It's like an exam: either you know your stuff or you don't. Zen out and accept that you may not be able to answer or know everything, and that no amount of last-minute cramming of new material will help. The more you worry about the material, the bigger a problem you create for yourself, because you'll just be more anxious.
- Have faith in what you do know. Whatever you do know, you need to communicate it -- your knowledge, your skills, and your character -- effectively. You are in an interview because someone liked your resume and your experience, so you have to be able to convey that experience well.
- Say "I don't know" if you don't know. I interview a lot of people, and one of my pet peeves is when I ask a tough question and the interviewee won't just say "I don't know". Sometimes the questions are designed to be hard, so if you don't know, don't dance around it; be frank so you don't waste anyone's time.
- If you really think it's an interesting question, you can say "Hmm, I don't know, but I've never thought about it like that..."
- Or "I don't have a background in that topic, but it's interesting, how would you approach it?"
- Or "Oh, that's an interesting question; I've honestly never thought about that before". That leads me to my next tip...
This is, in fact, a trait that I look for in an interview because it tells me that if this individual gets stuck, they will quickly speak up and let me know so I can help them get unstuck, and that they are willing to ask for help.
- Make it conversational. The more you treat it like a grilling, the more it will feel like you're over a fire. I treat every interview like a conversation and treat every question like a conversational discussion. The interviewer is a conversation partner and not a superior or an interviewer. This also leaves a lasting impression on them because they feel like you are someone they can easily talk to and people like to work with people they can talk to. Also, you are always free to turn the tables and interview your interviewer; remember, an interview is a two way street: they want to know if they should hire you and you want to know if you really want to work with these people.
- Dress sharp, watch your posture, and give a strong first impression. Harvard studies have shown that posture has a strong influence not only on how others perceive you but also how you perform. Stand up straight. Sit straight. Shoulders back. Project confidence but also look relaxed. Use hand gestures to help communicate. Make eye contact -- don't lock it, though -- that's freaky. Using full body communication is important but remember not to fidget. Basic stuff.
- Remember names and call people by names. When you meet someone and greet them, they will present themselves and always call them by their name immediately. Interviewer: "Hi, I'm James"; you: "Hi, James, I'm Lindsay, pleased to meet you". It's subliminal, but people like to hear their names and it helps you make an impression in your brain so you know who he was. At the end, repeat the interviewer's name: "James, I really appreciate your time".
- Don't forget to drink water. When you talk a lot, your mouth and throat will get dry, and if you don't hydrate, it will impact your ability to speak. Hit the restroom when you get a chance, even if you don't "need" to go, because if you get the urge during a discussion, it will distract you.
- And final tip is to create mental checkpoints. One thing that happens with me is that because I treat an interview as a conversation, it is easy to lose the original question or topic in a long discussion. So you have to make a mental checkpoint and be able to bring the discussion back to the original topic to answer the question. You don't want to be in a position where you have to ask "Sorry, what was the question?"
Silicon Valley, of course, is known for its casual dress, which means t-shirts, jeans and sneakers. But don't be fooled, techies care a lot more about fashion than they let on. Or put another way, there’s a lot of code in the Silicon Valley dress code.
In fact, engineer Alexey Komissarouk boasted he could tell if people were in tech and what they did by just looking at their dress. I met him a few months ago at the FWD.us hackathon and I asked him to show me his super power. He agreed and we met in downtown Palo Alto.
Before we got started, Komissarouk explained that Silicon Valley is full of tribes: there are the engineers, designers, product managers, salespeople, entrepreneurs, and VCs. And each tribe has its uniform.
The engineers? T-shirts, jeans and hoodies, of course.
“Hoodie signals young talent,” said Dan Woods, a techie we stopped on the street.
Woods walked by us and Komissarouk nudged me and said, “That guy, he’s a VC.”
The tip off? A zippered v-neck sweater.
“That’s like classic VC and then you got the button down underneath it, that’s like the classic uniform,” Komissarouk said.
We stopped Woods and asked him. Turns out, he did work in venture capital, which is about when he got the sweater.
Turns out the uniform is a long time tradition in tech, says Erik Schnakenberg, a co-founder of Buck Mason, a start-up that sells men's clothing online.
"I wear a pair of jeans and a black t-shirt almost everyday," Schnakenberg said. "It's one less thing to think about."
In the fast-moving world of tech, the idea is to show that you're not wasting precious time on something as vain as fashion. Schnakenberg says the uniform hasn't changed much, but tech is attracting a lot more of the cool kids, and they care about fashion.
It's also why I keep my head shaved myself. One less thing to think about when I roll out of bed.
...at work it’s strictly blue or gray suits. “I’m trying to pare down decisions. I don’t want to make decisions about what I’m eating or wearing. Because I have too many other decisions to make,” he tells Lewis. “You need to focus your decision-making energy. You need to routinize yourself. You can’t be going through the day distracted by trivia.”
A proposed rule of thumb for planning office spaces for development:
- Buy cheap desks.
- Buy expensive chairs.
Our office has it all wrong. We have desks that are solid as a rock but probably $600+ and seats that are $99 Office Max specials (just a guess). I suspect that this is the case with most offices where the planners spend a fortune on desks, dividers, cabinets, and so on (just go look up the prices of cubicle partitions from Hon or Steelcase) and go cheap on the seating.
The problem is that the majority of an office worker's time is spent sitting. Splurge on the chair instead. Don't go too cheap on the desks, though; Ikea Galants are perfectly stable, durable, lightweight, easy to hack, easy to refactor, and cheap, and assembling them is a great team-building exercise.
There is no exception to this rule. If you want sit-to-stand type desks, just get something like this:
As an added bonus: if you have top candidates coming into the office for interviews, they are far more likely to be impressed by fancy chairs than fancy desks.