If you're like me, you've noticed that organic milk tends to have a longer shelf life and tastes better, so I tend to spend the extra money; aside from our 1-year-old, the family drinks milk erratically, and I prefer the taste. But are these properties a result of the milk being organic? It turns out that there is a simple explanation for both that is quite interesting and may (or may not) change your mind about spending extra on organic milk.
First is the question of longer shelf life. This is actually a result of logistics. There are fewer farms producing organic milk, so it often has to travel further and undergoes a higher-temperature treatment that kills all bacteria:
The process that gives the milk a longer shelf life is called ultrahigh temperature (UHT) processing or treatment, in which milk is heated to 280 degrees Fahrenheit (138 degrees Celsius) for two to four seconds, killing any bacteria in it.
Compare that to pasteurization, the standard preservation process. There are two types of pasteurization: "low temperature, long time," in which milk is heated to 145 degrees F (63 degrees C) for at least 30 minutes, or the more common "high temperature, short time," in which milk is heated to roughly 160 degrees F (71 degrees C) for at least 15 seconds.
The different temperatures hint at why UHT-treated milk lasts longer: Pasteurization doesn’t kill all bacteria in the milk, just enough so that you don't get a disease with your milk mustache. UHT, on the other hand, kills everything.
Interestingly, UHT-treated milk no longer needs refrigeration (prior to opening). Your grocer keeps it refrigerated as a matter of consumer expectation (how silly we Americans are).
The answer to the second question arises from the answer to the first. UHT actually changes the chemical nature of the milk by breaking down some proteins and cooking some of the sugars. Organic milk tastes different not because it's organic, but because of the UHT process, which happens to change some of the molecular structure of the milk:
UHT sweetens the flavor of milk by burning some of its sugars (caramelization)....UHT also destroys some of the milk's vitamin content—not a significant amount—and affects some proteins.
So there you have it; organic milk does indeed taste different from non-organic milk, but it's not a placebo effect and it's not because it's organic. If you're a European in the US and you find our milk tastes funny, try the organic milk.
I may take this article up on its suggestion and give non-organic UHT milk a try.
One of the key takeaways, for me:
He was challenging me because he expected more from me. When somebody cares about you, that’s when they challenge you. When they don’t care about you, they ignore you. That’s when you should worry.
It is natural for people to react negatively to critical feedback, especially when that feedback addresses a shortcoming or a weakness in some work, some effort that one has invested heavily in.
It is important to step back, detach from the emotional sting of coming up short of perfection, and consider whether the critical feedback is dismissive or whether it reveals some unmet potential that we, ourselves, have not yet recognized.
The whole thing is a good read and a great piece of motivation for understanding what it takes to succeed.
Poorly Controlled Scope
Scope is enemy number one; it is the amorphous blob that threatens to grow until it is an uncontrollable monster, swallowing all of your carefully planned man-hours.
Increases in scope are often the result of a failure to manage the customer and their expectations. In any given project, there are only so many levers that can be used to control successful delivery, and it is up to the skilled project manager or client interface to work these levers of team size, timelines, requirements, and so on.
The worst is when growth in scope originates from within the team; it is a form of cancer that forces teams to compromise on quality to meet the timelines promised to the customer. When scope creep originates from the customer, there is an expectation that, of course, costs will increase or timelines will need to shift; after all, they are asking you to do more than was initially agreed upon. But when new scope originates from the team itself, the customer will not readily accept the delay.
The cost of scope increases is often not well accounted for. A change that takes a developer 2 days to make will cause ripples that force test teams to adjust their scripts, documentation teams to update their documents, and possibly trigger expensive regression testing.
Smart teams and leaders will understand that these can be controlled, in many cases, by simply creating a roadmap and understanding that desired features and capabilities that don't fit into existing timelines can be added to "v.next".
(Over) Reliance on Manual Effort
To a certain extent, software engineering requires raw manpower to execute large projects that require many hundreds of thousands of lines of code and lots of moving parts.
But within the lifecycle of a project, there are many activities that can be simplified through automation. Teams must judiciously balance the cost and effort of automation against the savings gained, but more often than not, even a little bit of automation is better than none. It was once the case that all phone calls were manually routed between parties; imagine if the billions of people on this Earth relied on the same process to connect phone calls today!
Testing is a great example where failure to automate creates a bottleneck to progress. It increases the cost of changes and bug fixes because it increases the cost of regression testing. Make regression testing virtually free, and the cost of introducing changes -- whether small scope creep or critical bug fixes -- decreases dramatically.
Technologies like Selenium WebDriver and Visual Studio's built-in tooling make it possible to achieve significant gains in testing productivity. Don't let excuses hold your team back.
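To make the "virtually free regression testing" point concrete, here is a minimal sketch using Python's `unittest` module; the `apply_discount` function is a hypothetical stand-in for any business logic that changes often enough to need regression coverage (Selenium WebDriver plays the same role for browser-level flows):

```python
import unittest

# Hypothetical function under test -- stands in for real business logic.
def apply_discount(price, percent):
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (100 - percent) / 100, 2)

class DiscountRegressionTests(unittest.TestCase):
    """Each past bug or agreed behavior gets a test; reruns are free."""

    def test_no_discount_returns_original_price(self):
        self.assertEqual(apply_discount(50.0, 0), 50.0)

    def test_half_off(self):
        self.assertEqual(apply_discount(80.0, 50), 40.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(10.0, 150)
```

Run the suite with `python -m unittest`; once tests like these exist, every change to the codebase gets a full regression pass for the cost of a single command.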
One skilled test automation engineer is worth her weight in gold!
Poor Communication and Collaboration
Strong and open channels of communication are critical for the success of projects, especially so when some or all of the resources are remote.
The flow of information and feedback from the customer to the design and engineering teams must be swift and clear so that expectations are known and any roadblocks can be communicated back. Engineering teams will often have insights into the challenges and nuances of a customer's input and it can be dangerous to agree to timelines or make promises without clearly engaging the teams executing the implementation. Ideas that seem simple on paper or in concept can require massive engineering changes or sacrifices to achieve and not properly estimating this work is a common pitfall.
DeMarco and Lister's Peopleware offers excellent insight into how to foster better communication and collaboration between teams.
Often, one of the simplest solutions is to simply talk to each other instead of using emails, chat messages, and worst of all: assumption ("Oh, I thought you already knew that"; we've all heard that one before!). Get in front of a whiteboard and draw out ideas, deadlines, goals, and so on. Go out to eat lunch together. Plan team activities that engage everyone. Make sure that everyone is on the same page on a professional level as well as a personal level.
Not Keeping Your Eyes on the Prize
It's easy for a team to get distracted and lose their focus on the goals of the project and the conditions of victory.
It is therefore critical that teams take a goal-oriented approach to the delivery of software projects. This is a mindset that scales up from daily scrums to weekly reviews and so on. Even a short coffee break can be used to re-orient a wandering team member towards the goal posts. Small, daily victories can help teams build momentum and continuously align towards the long-term milestones.
It's important that individuals and teams know, at any given time, what is expected of them and what the priorities of the project are. This allows individuals to make decisions autonomously and with little managerial overhead, as they understand how to align themselves with the goals of the project and team. Clear communication of goals allows any misunderstandings to surface early by pinning expectations to milestones -- be they daily, weekly, or project-level milestones.
Teams and leaders that are poor at communication and collaboration will often lose their focus on the prize because there is a lack of understanding about shifting goals and priorities; there is a dependence on assumption instead of clearly aligning all parties to a set of well-defined conditions of victory. These anti-leaders will focus on the tasks instead of the goals; it should be the other way around - focus on the goals and derive your tasks from them.
Unwillingness to Compromise
Teams must always be ready to compromise because this is the real world where timelines and successful delivery of usable software matters, but people also have families and life outside of work. Unplanned circumstances arise that challenge the best laid blueprints.
If it is discovered that a feature will negatively impact performance of the system in the current architecture, compromise must be made on either the feature or the timelines to ensure that the desired capability can be delivered as usable software.
If unforeseen circumstances eat into the project timelines, compromise must be made to clearly redefine the scope and conditions of victory.
This is the real world; man-hours are not unlimited, and an unwillingness to compromise when necessary leads to poor quality as a team pushes to make up time.
In many cases, it is a bitter pill to swallow as it may mean telling a customer that a feature must be delayed or built into the next release, but I find that more often than not, openness and clearly communicating these issues as early as reasonable is productive and allows for rational decision making.
On a message board, I read a thread in which a poster -- a research scientist -- described how he ended up becoming the de facto IT guy in his department simply because of his superior Google skills and his willingness to Google for and apply solutions to his colleagues' issues.
This is something I've personally never been asked to do in an interview, nor have I thought to ask others when I interview them, but being able to Google quickly and sift through the results to separate the wheat from the chaff is a supremely underrated skill in today's world of software engineering.
The fact is that developers and technology specialists today have to deal with so many technologies and understand such deep nuances that Google is often the only way any of us can get anything done, especially with the obscure errors that Microsoft and SharePoint loooove to throw at you.
In fact, I'm quite surprised that I've never been asked to do a Google search speed and accuracy test.
How would one design such a test to be effective at measuring a candidate's speed and accuracy at using Google? Should the topics be relevant to the candidates job domain? Or should it be more generic? Should it test a candidate's knowledge of Google's advanced features?
One of the lessons I've been mulling about the past few weeks is the importance of scope when delivering software.
Delivery of software can be thought of as a balancing act between three buckets:
- Time - the schedule and how much time you have to complete the work.
- Money - everything from spending on more resources to better resources, tooling support, and so on.
- Requirements - what you're building.
These are the three basic buckets that constrain the scope of what can be done, and they interact with each other in different ways. If there are more requirements or the requirements are inflexible, then more time or money (or both) is needed. If both requirements and time are inflexible, then more money will be required to achieve the goals within those limits. If money is constrained (fewer resources), then you must allow more time to deliver the requirements or trim the requirements.
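As a crude illustration of how the buckets interact, here is a toy linear model in Python. All units and rates are invented for illustration ("feature points", one point per developer-week), and the model deliberately ignores real-world effects like Brooks's law, where adding people late in a project slows it down:

```python
# Toy model of the time/money/requirements trade-off.
# Assumption (illustrative only): each developer delivers one
# "feature point" per week, and money buys developers.

def weeks_needed(feature_points, developers):
    """Time flexes once requirements and money (team size) are fixed."""
    return feature_points / developers

def developers_needed(feature_points, weeks):
    """Money (team size) flexes once requirements and time are fixed."""
    return feature_points / weeks

def points_deliverable(developers, weeks):
    """Requirements flex once time and money are fixed."""
    return developers * weeks
```

The point of the sketch is simply that fixing any two buckets determines the third; for example, 20 points with 4 developers forces a 5-week schedule, and capping the same team at 3 weeks caps the deliverable scope at 12 points.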
Having been in both consulting and software development, I've seen that each project places different priorities on these buckets, and those priorities drive the sizing, cost, and pace of the project.
But in software development, one thing that I think many folks -- even experienced individuals -- get wrong is a fixation on and inflexibility about requirements. Requirements are really much more fluid in software development projects than in contract-driven consulting projects. The reason is simple: in software development, the assumption is that there will always be another version and another point release; this is exactly what roadmaps are for.
Plan a solid roadmap and if you don't get a particular feature or capability in this release, you and your customers should have a good idea of which release it will be in down the road.
Some tend to lose sight of this and think that all features must ship within a constrained timeline. I think this is a losing proposition that forces a team to compromise on quality by artificially forcing requirements into a release, when the reality is that there are often truly critical features that must be delivered and nice-to-have features that would be great if they could be. Teams and leaders without discipline and a focus on the criteria of success will have a difficult time discerning the two and will lump nice-to-haves right alongside the critical work items. This is a recipe for failed projects, missed deadlines, and poor quality.
The reality is that software rarely comes out fully baked unless you're NASA launching a space mission worth billions of dollars and you only get one shot; most teams are not under such otherworldly constraints. There will always be something that could be better or easier to use, some missing feature discovered along the way, or some new idea. The trick for teams that succeed is being able to create boundaries around what is needed now.
Apple shipped seven versions of iOS before adding support for third-party keyboards and NFC.
NPR's first mobile site was terrible (I don't have a screenshot of it, unfortunately), but at least they shipped it and their audience could use it and now they've evolved it to be a clean, responsive web site.
Here's what an early version of Facebook looked like:
Microsoft shipped Azure without support for virtual machines until 2013.
But what if instead of new features, we're talking about bugs or design flaws? There is an important calculus at play here, and I think one way to frame it is this: a bug or design flaw that is preventing growth in userbase or revenue takes precedence over any other bug or design flaw in the shipped system. Think about it this way: if you have shipped four versions of the software with a given flaw, chances are you can ship a fifth without addressing it (of course, this is not always the case, especially if the flaw is security-related). But if a bug or design flaw is stopping adoption or holding back revenue, then that flaw automatically becomes the most critical focus for the release.
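That triage calculus can be sketched in a few lines of Python; the bug records and field names here are my own invention, purely for illustration:

```python
# Sketch of the triage rule described above: a flaw that blocks
# adoption or revenue outranks everything else; within each group,
# higher raw severity comes first. Field names are hypothetical.
def triage(bugs):
    # Python sorts tuples element by element: growth blockers
    # (blocks_growth=True -> key False -> sorts first), then severity.
    return sorted(bugs, key=lambda b: (not b["blocks_growth"], -b["severity"]))

backlog = [
    {"id": "ui-glitch",       "severity": 3, "blocks_growth": False},
    {"id": "signup-broken",   "severity": 1, "blocks_growth": True},
    {"id": "crash-on-export", "severity": 5, "blocks_growth": False},
]

ordered = triage(backlog)
# "signup-broken" sorts first despite having the lowest raw severity,
# because it blocks adoption.
```

The design choice mirrors the argument in the text: a severity score alone would bury the adoption blocker at the bottom of the list.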
The point is that in software product development, more often than not, the winning strategy isn't to get it perfect (as much as Steve Jobs would have you believe he got it perfect each time, the fact that there was a next release means it was intrinsically not perfect; there was always something to improve or add); it's to ship it and acknowledge that there will be a future release to add features and other improvements. This allows the team to focus on what's critical now and get it out the door on time.
To that end, a roadmap is important as a communication tool and a lever for controlling scope because it gives customers visibility and a sense of certainty that while feature X is missing in this release, in three months it'll be in the next point release. It helps manage the requirements bucket; without a roadmap, the tendency of the team and customers -- in my observation -- is to assume that every requirement is critical. This is a purely psychological notion: the lack of a roadmap makes it difficult for the team to offload some ideas and lesser requirements so that it can focus on the requirements that are truly necessary to ship. Without the concrete notion of The Next Release, the feeling will be that everything must be crammed into this release.
Ultimately, I think that for software development teams to successfully ship software -- given the typical constraints of time, money, and requirements -- it's important to be able to take a critical eye to the requirements and really be able to categorize and define the scope of what is critical versus what is nice to have. A clear roadmap is an important tool to help teams organize work and thoughts as well as communicate intent to customers.
I was reading an NPR piece on worker burnout and some different tactics taken by different companies to deal with it and came across a very nice, concise definition:
Christina Maslach is a professor at the University of California, Berkeley, whose four decades of research on the subject helped popularize the term "burnout." Maslach says it's a loose term that encompasses a combination of work overload, lack of autonomy and reward and social and moral discord at work.
This sentence very concisely summarizes the key drivers of burnout and the factors at play are not as simple as "too much work".
The article also brings up an interesting observation (well, it's just the next few paragraphs):
Most burnout stems from interpersonal strife, but most employers see the solution as time off, she says.
If companies really want to know what's causing burnout in their workplace, Maslach says, they shouldn't just mandate more time off. They should assess the core problem, then design solutions to mitigate those issues.
"When it's time off, I mean, that might be time away from work," Maslach says. "Maybe you're addressing issues of exhaustion, but it's not really addressing what may be the problems at work."
Ultimately, a company, a project, a product -- it is the effort of many individual humans who must come together to fulfill a common goal. And when humans are involved, conflict is sure to arise. Obviously, you can still get things done when not all of your parts are in harmony, but isn't it much more enjoyable when they are?
I hardly consider myself an expert, but in my own experience, I've found that it's a good idea to reinforce the relationships between the people that comprise the team through team activities. A common one is eating together, or occasionally taking the whole office to lunch or dinner. It is especially important for management to be involved because it shows that employees are valued as people and not just as fungible parts of a machine.
Andrew Fitzgerald comments in that NPR article:
One day I got called into the boss's office. I was thinking to myself, "Shoot! What does this guy have on me now?" They called me in just to tell me that they thought I was doing a good job and that they appreciated my work ethic. I didn't make a lot of money, and the work was kind of tedious and repetitive, but I could not tell you how good that made me feel. A little positive feedback from the higher-ups goes a long way.
At IC, the development team is in a unique position because all of us work remotely and travel to Irvine. So we end up spending quite a bit of time together eating meals, going to the shooting range, and kayaking on the weekends, and I'm planning on taking the team to an indoor climbing facility as well (I try to keep things fresh). I also try to make sure that everyone is taken care of; there is nothing I won't do, whether it's picking up lunch, driving a co-worker to the train station, or picking up fruit for everyone to share. Not just because I manage them, but because I like and respect these guys as people first and foremost.
Even at a basic level, we sit together in the office and chit-chat from time to time about random things and watch random videos after we've been hacking away for 8 or 9 hours. When we are on site, not one member of the development team leaves before the others. Not because anyone is forced to stay, and not because we have some unspoken code about such actions or that we would shame anyone that did, but I think because we all feel that we are in this together and that truly, we have a common goal to achieve as a team.
And that is an important point, in my opinion, because too often, how leaders fail is by not aligning all of the cogs of the machinery towards a common goal. Most of the time, that simply involves clear and open communication about expectations, company goals, and an understanding of the priorities of the company or the team.
Like a train with two engines heading in opposite directions, a failure by team leads to align the members of a team to a goal, or a failure by management to communicate expectations and priorities, leads to inaction, indecision, and conflict as team members pull in opposite directions. Ultimately, this just helps feed worker burnout.
In the wake of the Apple iCloud debacle, there has been a lot of discussion on what Apple has done wrong, what it could do better, and how this could have been prevented.
This is not a blog post about 2-factor authentication or proper implementation of authentication channels or how Apple should be more open in their dealings with the security community, but something more basic and common sense: give users more granular control on what gets backed up.
You will see in many discussions and comments to articles that there is quite a bit of "victim shaming".
But I think this is quite unfair; I'd postulate that the average smartphone user has no idea that their photos are being synced to the cloud. Even those with an abstract sense that syncing happens (you take a photo on your phone and can see it on your desktop later) likely have no concrete idea of the implications: those photos are now resident in the cloud rather than transient.
It is easy to imagine that such things are obvious and trivially easy for end users to configure and control, but that is a poor assumption for anyone technically savvy to make; people like my mother and wife really have no idea about these things. My guess is that Jennifer Lawrence and Kate Upton simply had no idea that their photos and videos were sitting resident in the cloud, and even if they did, they probably couldn't figure out how to get rid of them.
Some have said that this is not the fault of the OS makers or app makers. Google Photos, for example, gives you a very clear screen when you launch it for the first time asking you if you'd like to sync the files to the cloud. But one problem is that users may not actually read these things before agreeing. The other is that even after a user agrees, if the user decides that she wishes to change her mind, the setting is turned off from a screen that is three levels deep (launch Photos, click Menu, click Settings, click Auto Backup). While this is very obvious to some, to many -- like my mother -- this is an absolute mystery. She has no idea that it's syncing her photos and has no idea how to turn it off.
I think there are many common-sense solutions, beyond hardening the security measures themselves, that would give users more granular control over their content.
Give Periodic Notifications to Update Privacy Settings
One simple idea: every three months or so, the phone prompts you with a notification in your notification bar:
This would allow users to periodically be reminded that things like automatic sync are on and that they have the option of turning them off. The user is free to ignore it, but it would give them at least a reminder that "Hey, I'm sending your stuff to the cloud, are you OK with that? Do you want to review your settings?"
Make Synchronization Explicit
One of the problems I have with Google Photos is that it's all or nothing by default. There isn't a middle ground that allows me to sync some of my photos as a default.
The user experience paradigm here would be much like that of Facebook, where you post photos by explicitly selecting them from your album, giving you fine-grained control over what gets sent to the cloud. Likewise, iCloud and Google Photos would do well to allow a middle ground that gives users more fine-grained control over what gets synced, instead of just ON and OFF.
In discussions, some have said that this would place too high a burden on end users, but it seems to work fine for Facebook, and I think it would be relatively easy to implement in a usable manner:
If the user selects "Sync All", then all 20 new photos are synced to the cloud (be that iCloud, Dropbox, Google Drive, etc). If the user selects "Choose", the user is given a screen that allows the user to explicitly pick the ones to sync. The pick screen should prompt the user to "Ignore unselected items for future backup?" when selection is complete so that any unselected photos are simply ignored next time. If the user selects "Don't Sync", then do nothing.
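A minimal sketch of that three-way choice, in Python, might look like the following; all names and structure are my own invention for illustration, not any platform's actual API:

```python
# Sketch of the "Sync All" / "Choose" / "Don't Sync" flow described
# above. A real implementation would live inside the platform's
# photos app; this just models the decision logic.
def handle_new_photos(choice, new_photos, selected=(), ignore_rest=False):
    """Return (photos_to_sync, photos_to_ignore_next_time)."""
    if choice == "sync_all":
        return list(new_photos), []
    if choice == "choose":
        to_sync = [p for p in new_photos if p in selected]
        # "Ignore unselected items for future backup?" -- only skip
        # them permanently if the user opted in.
        ignored = [p for p in new_photos if p not in selected] if ignore_rest else []
        return to_sync, ignored
    return [], []  # "don't sync": do nothing
```

The key property is that nothing reaches the cloud without an explicit decision, while "Sync All" preserves the one-tap convenience for users who want it.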
A simple design like this still gives the user access to the convenience of cloud backups while giving them explicit, fine-grained control and acknowledgement that their data will be stored in the cloud.
The victim shaming is simply not warranted; whether these individuals should or should not have taken these compromising photos and videos is not the right question to ask. The right question to ask is whether Apple or Google should be automatically syncing them to a resident cloud storage without finer grained controls and explicit consent.
The principle is that any time you -- as it were -- voluntarily give up control.
In other words, cease to cling to yourself; you have an excess of power because you are wasting energy all the time in self defense.
Trying to manage things, trying to force things to conform to your will.
The moment you stop doing that, that wasted energy is available.
Therefore you are in that sense -- having that energy available -- you are one with the divine principle; you have the energy.
When you are trying, however, to act as if you were God -- that is to say, you don't trust anybody, and you are the dictator, and you have to keep everybody in line -- you lose the divine energy, because what you are simply doing is defending yourself.
One mistake that I've been guilty of is trying to force things to conform to my will on various projects (I still do it to varying degrees!). It is usually with the best of intentions -- for a cleaner framework, a better product, a more efficient process -- but at the same time, it is true that a lot of energy is wasted in doing so.
What is the alternative, then?
I think Watts is right that a level of trust has to exist that the team around you can help you achieve your project goals. Instead of expending the energy in controlling the members of the team, spend the energy in building that trust through training, mentorship, guidance, and giving up not just control, but responsibility.
Sometimes that trust will be unwarranted, but sometimes, that trust will pay itself back many-fold.
Another finding was that the perception of heat could affect the participants' emotions. The research team said that this is because emotions are more intense under bright light, thus leading to the perception that light is heat, which can trigger more intense emotions.
The team also found that bright light affects the kinds of decisions people make. Since the majority of people work during the day under bright lighting conditions, the researchers noted that most daily decisions are made under bright light, which intensifies emotions.
Accordingly, they suggest that turning the light lower may help people make more rational decisions, not to mention negotiate better settlements in a calmer manner.
Taking emotion out of decision making is one of the most important skills one can develop and maybe it can be as simple as installing more ambient lighting and turning off those (ugly) fluorescent lamps overhead!
From Peopleware, 3rd Edition:
The propensity to lead without being given the authority to do so is what, in organizations, distinguishes people that can innovate and break free of the constraints that limit their competitors. Innovation is all about leadership, and leadership is all about innovation. The rarity of one is the direct result of the rarity of the other. (p. 101)