Brandon Donnelly
Daily insights for city builders. Published since 2013 by Toronto-based real estate developer Brandon Donnelly.

The word on the street is that Sonder -- the marketplace for vacation rentals and competitor to Airbnb -- is close to finalizing a $200 million investment round that would value the company at $1 billion.
I first wrote about Sonder back in 2016 after I met someone from their business development team here in Toronto. I have yet to stay in a Sonder, but I've looked at their rentals a few times.
One of the main differences between Sonder and Airbnb is that the former head leases their rental supply. And they do this by trying to go higher up on the food chain and partner with developers and real estate operators.
In this regard, they are similar to WeWork. And it allows them to sit somewhere in between Airbnb and a conventional hotel. The supply is distributed, but the service offering is more consistent.
Of course, this arguably makes their business model slower (they have to negotiate leases) and more costly (they're committing to fixed costs). So it becomes a question of: How valuable is that consistent service offering?
Lately when I travel, I've been trending more toward hotels, as opposed to Airbnb-like rentals. I like the experiences that many hotels are now focused on creating and I like knowing that if my flight arrives late (in a place like Brazil), I'll be able to get into my room.
I guess consistency does matter.
Photo by Spencer Watson on Unsplash
This past Monday, Tesla held an event for its investors called "Autonomy Day." It was livestreamed, but if you missed it, here's the video. It's almost 4 hours long, though the first hour is just footage of Tesla vehicles driving around. I'm assuming it was background content.
https://youtu.be/Ucp0TTmvqOE
I'll be honest: I haven't watched it all. But there's a lot here if you want to get into the inner workings of their self-driving cars. At the event, Musk also promises that Tesla will have level 5 autonomy ready by the middle of next year (2020). At that level, you will no longer need to pay attention to the road as a driver.
Along with this autonomy, the company plans to start rolling out "robotaxis" and a ride-hailing app that will allow owners to rent out their cars. Musk is predicting that this could generate upwards of $30,000 in profit per year for owners. Of course, at this point, nobody really believes any of these promises. Musk is notorious for overselling.
But let's imagine that robotaxis are the future. Maybe it won't happen by the middle of 2020. But it will happen at some point.
If taxis are automated machines that drive people around all day and then go and park somewhere during off-peak times, where do they want to go and park? Does autonomy suddenly disconnect where owners live from where their cars park, because your car will simply come to you when you need it?
And what do these features mean for parking supply? Presumably (and we have talked about this before on this blog), you need less parking and it wants to be in locations where real estate values are lower. But because of this, I bet that we're going to need to start -- and get really good at -- pricing road usage.
What are your thoughts?


Machine learning is one of the most important trends in tech right now. But like anything new, it naturally raises a number of important questions and concerns. Benedict Evans' most recent blog post provides a good explanation of what he refers to as artificial intelligence bias. Here are a couple of excerpts that I found interesting.
What machine learning does:
With machine learning, we don’t use hand-written rules to recognise X or Y. Instead, we take a thousand examples of X and a thousand examples of Y, and we get the computer to build a model based on statistical analysis of those examples. Then we can give that model a new data point and it says, with a given degree of accuracy, whether it fits example set X or example set Y. Machine learning uses data to generate a model, rather than a human being writing the model. This produces startlingly good results, particularly for recognition or pattern-finding problems, and this is the reason why the whole tech industry is being remade around machine learning.
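To make the excerpt above concrete, here is a toy sketch of the idea: instead of hand-written rules, we hand the computer two sets of labelled examples and let it build a model, then ask it to place a new data point with a rough confidence score. The data and the nearest-centroid "model" are entirely hypothetical, chosen for simplicity; real systems use far more data and far more sophisticated models.

```python
# Toy illustration: classify a new point by which example set it sits
# closer to. The "model" here is just the mean (centroid) of each set.

def centroid(points):
    # Component-wise mean of a list of equal-length tuples
    n = len(points)
    return tuple(sum(c) / n for c in zip(*points))

def distance(a, b):
    # Euclidean distance between two points
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify(point, examples_x, examples_y):
    """Return ('X' or 'Y', rough confidence score between 0.5 and 1)."""
    dx = distance(point, centroid(examples_x))
    dy = distance(point, centroid(examples_y))
    label = "X" if dx < dy else "Y"
    confidence = max(dx, dy) / (dx + dy)  # nearer centroid -> higher score
    return label, confidence

# Small stand-ins for the "thousand examples of X and a thousand of Y"
xs = [(1.0, 1.2), (0.8, 1.0), (1.1, 0.9)]
ys = [(4.0, 4.1), (3.9, 4.2), (4.2, 3.8)]

label, conf = classify((1.0, 1.1), xs, ys)
print(label, round(conf, 2))  # the point lands in set X with high confidence
```

The point of the sketch is that nobody wrote a rule saying what "X" looks like; the examples themselves define it, which is also exactly why skewed examples produce a skewed model, as the next excerpt explains.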

The rub:
However, there’s a catch. In the real world, your thousand (or hundred thousand, or million) examples of X and Y also contain A, B, J, L, O, R, and P. Those may not be evenly distributed, and they may be prominent enough that the system pays more attention to L and R than it does to X.
What AI isn't:
I often think that the term ‘artificial intelligence’ is deeply unhelpful in conversations like this. It creates the largely false impression that we have actually created, well, intelligence - that we are somehow on a path to HAL 9000 or Skynet - towards something that actually understands. We aren’t.
The conclusion:
Hence, it is completely false to say that ‘AI is maths, so it cannot be biased’. But it is equally false to say that ML is ‘inherently biased’. ML finds patterns in data - what patterns depends on the data, and the data is up to us, and what we do with it is up to us. Machine learning is much better at doing certain things than people, just as a dog is much better at finding drugs than people, but you wouldn’t convict someone on a dog’s evidence. And dogs are much more intelligent than any machine learning.
Photo by Ales Nesetril on Unsplash