Inequality in the Gig Economy
Gig economy jobs have soared in recent years, but as consumers receive quick service and same-day deliveries, what are the workers getting? When this type of work came along, including ride-hailing services and ice cream delivery, some thought women would benefit greatly. But data show this hasn’t happened yet; inequality is a growing feature of the global workforce. The “Uberization” of today’s gig economy means its most visible jobs are done mainly by men, which has left the discussion of women in this sector largely on the sidelines.
An edited transcript of the conversation follows.
Knowledge@Wharton: A report by Julia Ticona, Assistant Professor at the Annenberg School for Communication at the University of Pennsylvania, takes a look at these issues. The report is titled “Beyond Disruption: How Tech Shapes Labor Across Domestic Work and Ridehailing.” It’s a pleasure to have Julia joining us here in-studio. Nice to meet you.
Julia Ticona: Thank you so much, Dan. I appreciate it.
Knowledge@Wharton: Thank you. Let’s dig into the research and give us a backstory about how you accomplished this, and what you were looking for.
Julia Ticona: Absolutely. My co-authors, Alexandra Mateescu and Alex Rosenblat, are both colleagues of mine at the Data & Society Research Institute, where I was a post-doc while I did this research. We noticed in 2016 that women were being completely left out of the media coverage and business understanding of what was happening in the gig economy, and of future-of-work conversations in general, which was really disturbing to us – especially because at the time the Pew Research Center had put out a report in 2016 showing that 55% of the people who find paid work through these online labor platforms are women.
And I’m a sociologist by training, and when you see women doing something at roughly the same rate as men – 55% women, in this case – it’s pretty remarkable in the world of work, because oftentimes men’s and women’s labor is very “gender-segregated,” as they say.
That prompted us to say, “We need to be paying more attention to what women are doing with these apps on these platforms.” Along with my two co-authors, we interviewed ridehailing drivers. These are folks who are working for Uber and Lyft. And we compared those folks to people who were finding work through domestic work apps. These are work apps like TaskRabbit and Handy, and also care work apps like Care.com, UrbanSitter, and Sittercity.
Knowledge@Wharton: Why do you think, then, there was this dynamic in place where there was more attention given to the men in this workplace or this sector than there was to the women?
Julia Ticona: I think it’s Uber, honestly. Uber, as a company, pursued a kind of public relations strategy that is like disaster governance, right? They were governing by scandal, using all of these scandals around their platform to generate publicity, good or bad. I think the company has changed tactics now, thankfully. But at the time we were starting this report – and if you think back to around 2014, when we saw this peak Uber coverage – you couldn’t open The Wall Street Journal, The New York Times, or any kind of business publication without seeing something about the latest scandal or something that Uber was doing.
And at that same time, the platforms where these women were finding work – I say “finding work” rather than “employed,” because these platforms are not employers – were not pursuing that kind of public relations strategy, in large part because the kind of work women do through these apps requires a degree of trust, solidarity, and care with the clients they’re meeting that just doesn’t jibe with that kind of public relations strategy, right?
And so I really think that they were being left out of these major publications because of the way that these companies are covered in the media.
Knowledge@Wharton: What was the impact, then, from this type of an approach playing out in the gig economy, especially with some of these other areas of it? When we think “gig economy,” we first obviously think about ridesharing. But with the care positions, what kind of impact were you seeing?
Julia Ticona: In terms of what was happening with the platforms on the –
Knowledge@Wharton: Correct. And the impact on the worker.
Julia Ticona: What we’ve seen overall is that these platforms have had a multiplying effect on the ways domestic workers look for work nowadays. Domestic workers have always been in an informal, gray part of the labor market, right? A lot of domestic work happens under the table, and a lot of these folks are undocumented immigrants. That happens that way absolutely on purpose. But these workers have always found work; it has always been necessary for them to look for work through five or six different means all at the same time.
They use agencies. They look for work online through things like Craigslist, on Facebook. They use word-of-mouth networking and they find each other jobs through friend networks and other people that they’ve worked with at past jobs.
And what we’ve seen with the entrance of platforms into this economy is that this has added a whole other layer, a whole other consideration that these workers have to take into account when they’re doing their job search. It’s basically adding more unpaid labor.
Knowledge@Wharton: But in many cases, these are apps and sites that are determining a lot of this information through algorithms. How does the use of the algorithm have an impact on this, as well?
Julia Ticona: Yes, that’s a great question. What we found is the way that algorithms function in this environment is a little bit different from the way that they function in ridehailing and in some of these other more familiar platform environments. What we found was that these care work companies, much like Uber and Lyft, claim that they’re democratizing work, right? They’re bringing down the barriers, they’re making it so that anybody and everybody who wants to care for a kid or an elderly person can come in – as long as they pass a background check – to do that.
And where we come from as a critical social science perspective on this is saying instead of taking that company line at face value, what we wanted to look at is when you democratize something, you’re not necessarily just removing, destroying barriers. But those barriers move around. And so what we wanted to understand is, okay – what are the new barriers? How are those barriers being shifted, and who are they maybe disproportionately affecting?
And what we found is that algorithms function differently on these care websites, because they don’t monitor and track the execution of work tasks. As an Uber driver, the algorithm is tracking, we believe, the speed of your car, how you brake, all these different things. With care, they’re not trying to track how quickly you bathe a child or how effectively you deliver that snack – yet, at least. But what they are doing is trying to match clients and workers together in a very effective way. And so what we found is that the effect of this is immense pressure on workers to brand themselves, to present themselves in a way that makes them seem trustworthy and employable and recognizable to these algorithms.
So whether or not you’re actually trustworthy and employable, which is a whole separate question, the algorithms prioritize different things in making workers present themselves in particular ways.
Knowledge@Wharton: So then are we talking about a change that needs to occur, either in the thought process at these companies or in the actual algorithm, so that workers don’t have to perform trustworthiness? So your persona simply reflects that you are trustworthy.
Julia Ticona: Yes, that’s a good question. I think there are a bunch of changes that companies could probably make to lessen the pressures of some of these environments that they’ve created for workers. But I see my role as a researcher as really bringing these experiences to the surface, and really saying that when we ignore women’s experiences in this part of the labor market, we completely miss the importance of these experiences and kind of leave it to the companies, leave it to the policy researchers to really use their expertise and make those decisions about how those changes should be made.
Knowledge@Wharton: But are the companies aware of them? You mentioned Care.com being one of them. Is Care.com aware of some of these issues and understanding that they recognize this and that maybe they need to make change or they need to tweak their process moving forward?
Julia Ticona: Yes, I believe so. I tend to come at this research from a place of good faith, where I really believe that these companies are trying to do the right thing. I think they have bottom lines. I think they need to make money for their shareholders. Care.com is a publicly traded company. It has been a publicly traded company for many, many years, as opposed to a company like Uber that just went public a couple of months ago. And I think they’re always trying to balance those priorities, as any kind of business does.
I think because they haven’t had the kind of public scrutiny that some of these other companies have had, they’ve been able to fly under the radar a little bit more. And I do see part of my role as bringing that out and saying, “Hey, we should be looking at these technologies that are affecting a huge number of workers.” Care.com has 11.5 million workers registered to work on its platform in the United States alone. And this is an international platform, right? That is a huge number of Americans, or people living in this country, who are affected by these technologies. And so I think it’s time to publicly recognize these inequalities.
Knowledge@Wharton: But then I would imagine there also has to be a concern around data privacy with this, as well. Because if you have 11.5 million people registered, and probably not all are active at one particular time, you’re still talking about a lot of personal data that is in the Care.com system and obviously in the internet that will benefit people that would want to have a caregiver, but also is a potential touch point for a hacker, for somebody looking to steal information.
Julia Ticona: Yes, absolutely. You’re right on target. There are two ways that this is really important. The first is that one of the things that workers really told us as being concerning about the platform is their sense that what – to put it colloquially – “What happens on Care.com doesn’t stay on Care.com.” Right? It just kind of migrates all over the internet, because these are networked media, so they’re not these kind of walled gardens where things just kind of stay within one site.
So as an example, when you create a profile on Care.com, the default is that the profile is public. What that means is that all of this personal information – workers are encouraged to post profile pictures, biographical descriptions of themselves, their work history, their education history, their weekly schedule availability, and even to connect their social media accounts – all of these ways you can make yourself individually visible are publicly available on the internet.
And so you or I or anybody without a Care.com account can Google “nanny” and a person’s name, or “elder care worker” and maybe a city, and all of that information comes up through Google. So data privacy in this context is extremely important, especially for this workforce, which we know is overwhelmingly women of color and has a high proportion of undocumented folks.
Knowledge@Wharton: You have also talked about the fact that there’s an element of this where these workers are also to a degree treated like consumers in this process, as well, which I think when you look at that specifically, you want to have a differentiation between worker and consumer, but maybe that wall is not there.
Julia Ticona: Yes, so this is one respect in which Care.com and these other care platforms are actually similar to the ridehailing platforms. My colleague Alex Rosenblat has written about this in her book Uberland, which, if you’re interested in this, you should definitely check out. There’s a kind of blurring of boundaries with language – this is where language and culture become really important – where these companies refer to everyone on their platform, meaning both the people requesting rides and the people giving rides in their cars, simply as “users.”
And these care websites do exactly the same thing. Where we’ve seen this is in their handling of scams. Something a lot of people might not know, if they haven’t used these websites, is that scams are really rampant on them. Care workers are constantly bombarded with fake jobs, where they’re offered exorbitant amounts of money to care for children who don’t really exist. These are essentially fake-check scams, so they’re not very sophisticated, but they prey on newcomers: the nannies are solicited to send a check to an address, and then they lose that money.
And so when the care platforms try to protect workers against this and inform them about these scams, they tell workers that if they suspect a scam, they should obviously tell the platform – they can flag an account. But they should also report these people to the FTC – the Federal Trade Commission, the federal agency that regulates consumer issues – and not to the Department of Labor or something like that. And so we can see this kind of slippage, where workers are being regulated as consumers, or being encouraged to pursue remedies as consumers.
Knowledge@Wharton: And I think that’s part of a bigger question we’ve talked a lot about on this show around other topics: whether there is a true understanding within the U.S. government and its agencies of the potential pitfalls here, this being another instance. We’ve talked about what’s happened with Facebook and the Equifax scandal as well. But now you’re talking about another entity that could very well be in the same realm, because of the data involved and whatever level of data security these companies maintain.
Julia Ticona: Yes, and actually the news about the FTC and the Facebook fine that was just recently levied is actually very hopeful to me. I think, again, I’m not on the policy side of things. I like to stay on the research side. I leave it to the policy folks to decide what the best strategy is as far as where the political momentum is and what they’re able to do politically to regulate these companies. But I think there’s a case to be made that when you look at different types of platforms – social media platforms, labor platforms – there are a lot of similarities there in terms of the kind of power and the kind of data that they have access to that might require similar kinds of regulation.
Knowledge@Wharton: I mentioned the impact on the people who are working in these areas, but what about the impact on the companies themselves?
Julia Ticona: The impact of…?
Knowledge@Wharton: Well, they’re obviously looking at Facebook, which is facing a huge fine and a massive loss. And for a publicly traded company, that means a negative financial impact on the bottom line, as well.
Julia Ticona: Yes, absolutely. I would love to learn more about that. I would love to get the good folks at Wharton to explain how those business dynamics work to me.
Knowledge@Wharton: How do you think, then, the inequality needs to be better addressed?
Julia Ticona: Yeah, I think there needs to be a much broader conversation, a much broader acknowledgement of the importance of data protection as worker protection. When we’re talking about workers’ rights – things like equal pay, non-discrimination against minority groups, even basic benefits like access to insurance and other kinds of workplace protection – I think data protection has to be part of the conversation about workplace protection for the 21st century. And in that way, studying what might seem like an esoteric or weird case – domestic workers on labor platforms might not seem like the most relevant group of workers to be studying – but these folks are, for lack of a better phrase, “canaries in the coal mine” for pressures that are going to be affecting all of us as American workers in the years to come.
And so I think it’s in our own interests, not just in the interests of this particular group of workers, that we pay attention to the kinds of inequalities that they’re facing, because they’re soon going to be facing all of us.
Knowledge@Wharton: Are you optimistic that we can move in that direction? There are, obviously, a lot of concerns out there that we get to a certain point, and then companies are not willing to go beyond that point, to take that step 3, 4, and 5 in the process.
Julia Ticona: Yes, that’s a great question. I think about that professionally a lot. I don’t think I would be doing the kind of research I do if I weren’t an optimist about this stuff, because otherwise it would just be too depressing to continue. I tend to be a very optimistic person, and I think that when advocates, policy-makers, and politicians have sufficient pressure on them, and are also able to figure out ways to make the financial case to companies – to really say, “Hey, look – if you,” meaning the companies, “want to ensure that you have a diverse workforce, that your platform is inclusive, that you really are ensuring economic opportunity for all, which is part of the platform promise to workers in the United States, you’ve got to address these issues” – change can happen. And eventually, if companies don’t, there’s going to be consumer pressure.
Knowledge@Wharton: But I would think there would also be hope that you would be able to have that type of a conversation. And obviously the government piece to it, I think, is important. But if you can have that conversation between the workers and the companies themselves, and if those two parts have a recognition of it, that maybe they can work together, it may be company by company by company – but at least you’re moving in the right direction.
Julia Ticona: Yes. I hope that through reports like this, the workers we’ve interviewed and some of the worker advocates we’ve been working with through the process are able to gather the strength and the confidence that their stories are not unique – that these are not individual problems, but shared social issues that they are seeing systematically within their communities.
Knowledge@Wharton: Is there a next step in the process from a researcher’s perspective? Is there a next natural step for you playing off of this research to take this even further?
Julia Ticona: Yes, sure. I am really interested in getting the perspectives of parents, the folks who are looking for care through these platforms. That’s the next population I want to understand: how they actually see these interfaces and interpret these messages from the companies, instead of, again, taking those messages at face value.
Knowledge@Wharton: Great meeting you. Thank you for coming in.
Julia Ticona: You too. Thank you so much.
Knowledge@Wharton: Nice to meet you. Julia Ticona is an Assistant Professor at the Annenberg School for Communication here at the University of Pennsylvania.