

Fostering IMPACT
FACULTY IMPACT REPORT 2025
A Letter from the Dean
The David Eccles School of Business is embarking on our next ascent — one defined by ambition and a commitment to shaping the future. At the core of this effort, and essential to advancing our priorities of driving student success, creating societal impact, and building an enduring legacy, is our world-class faculty.
Our faculty are not only outstanding educators, but also renowned thought leaders. Their research creates best practices, informs policy, and helps people, businesses, and economies thrive.
In this summary, you will see examples of that impact: from how early exposure to artificial intelligence can hinder long-term learning and problem-solving skills, to how political polarization is shaping investment behavior; from how the red tape of public assistance programs can keep families from accessing critical support, to how COVID-19 spurred a corporate reckoning on employee mental health and well-being.
The Eccles School will continue to transform business education through high-impact learning, stronger industry partnerships, and scholarship that elevates both visibility and influence. I am grateful for the dedication of our faculty, whose work inspires the next generation of business leaders and strengthens the communities and economies we serve.

Kurt Dirks
Dean, David Eccles School of Business



TAKE A CLOSER LOOK AT OUR SKILL AND TALENT
DAVID ECCLES SCHOOL OF BUSINESS
SCHOOL OF ACCOUNTING
Xiaoxia Peng
The COVID-19 global pandemic proved a catalyst for seismic change across so many facets of society. The business landscape was particularly impacted, with supply chain disruptions, cash-flow and liquidity issues, and workforce volatility combining to wreak havoc.
Such problems were frequently destructive to companies’ bottom lines, prompting drastic restructurings that often were manifested in the form of furloughs, layoffs, and slashes to various remuneration policies.
Xiaoxia Peng was particularly intrigued to study whether and why companies also chose to cut the salaries of their top executives, especially Chief Executive Officers (CEOs). She and her coauthors wound up with a detailed look at what drove such decisions.
“Rather than relying on annual compensation data from commercial databases, we manually reviewed SEC filings including quarterly reports [form 10-Q] and press releases [form 8-K] to identify firms that explicitly announced pandemic-related salary cuts for CEOs,” said Peng. “This approach allows us to isolate deliberate actions taken by boards to reduce CEO pay, rather than mechanical declines tied to formulaic performance-based compensation. It also allows us to precisely measure the timing of those decisions and examine how the market responds to them.”
Their research enables us to understand how big companies try to balance saving money, keeping investors happy, and treating employees fairly — especially in uniquely difficult circumstances.
The study looked at nearly 500 companies that cut CEO salaries and
compared them to more than 3,000 that didn’t. It shows that cutting executive pay can be a smart public relations move, even if it doesn’t always mean those executives actually take a hit to their wallets.
Among the key findings are that companies were more likely to institute such cuts if they had high pay gaps between the CEO and average workers, had stronger company oversight (such as more independent or female board members), were already performing poorly before the pandemic, did not have much in the way of cash reserves, and were laying off employees.
There were three primary motivations for companies that cut CEO pay — to save money; to show solidarity with workers, particularly if layoffs were enacted; and to avoid criticism and/or bad publicity.
“When examining market reactions to the announcement of CEO pay cuts, we find that investors respond more positively when firms with higher CEO-to-employee pay ratios announce cuts, consistent with the notion that such cuts help legitimize other tough decisions made by those firms in managing the pandemic,” Peng said.
Interestingly, their research found that firms which acted as “first movers” among their peer group in announcing cuts frequently benefited from a positive external reaction, as such decisions were perceived as sending a clear signal of accountability and leadership. In reality, though, the research demonstrated that these cuts tended to come from companies with weaker performance or higher leverage.
Meanwhile, one of the researchers’ most intriguing findings revealed
that announcements of cuts to CEOs’ salaries often didn’t actually yield an overall reduction in compensation.
Our findings suggest that when facing crises, boards should consider not only economic factors, but also how CEO pay decisions align with broader workforce actions.”
Many companies changed their approach to how bonuses were measured, shifting away from profit-based targets that were difficult to reach during the pandemic and instead granting equity compensation that began modestly (as stock prices were low early in the pandemic), but which later appreciated significantly in value.
In this way, boards were able to save cash in the early days of the pandemic by using longer-term equity pay to restore incentives.
So, what’s the ultimate takeaway? What can current or future companies learn from this research?
“Our findings suggest that when facing crises, boards should consider not only economic factors, but also how CEO pay decisions align with broader workforce actions, such as layoffs or furloughs,” Peng said. “Firms that take a proactive and visible approach, especially by being a first mover, appear to be rewarded by shareholders.”

DEPARTMENT OF ENTREPRENEURSHIP & STRATEGY
Colleen Cunningham
One of the core principles of business is that extended success is dependent upon constant innovation. Historically, one surefire way to keep tabs on new trends and technology has been tracking patents for intellectual property. Firms also use trade secrets to protect their intellectual property; in 2016, a federal law strengthened the protection of trade secrets across the U.S.
As Colleen Cunningham notes, trade secrets themselves are not new — “Zildjian cymbals, the formulation for the alloy in those, that’s been held since the Ottoman Empire” — and neither are protections for them. While protections were originally rooted in common law, a series of state-specific laws passed from the 1980s onward were based on the Uniform Trade Secrets Act, although, as Cunningham pointed out, they are not entirely uniform in the level of protection they provide.
Then, in 2016, came the passage of the Defend Trade Secrets Act (DTSA), which created a federal jurisdiction for bringing forward claims of misappropriation of trade secrets. Cunningham and her research collaborator wanted to investigate how the law changed the ways that companies used trade secrets. They chose to specifically focus upon DTSA’s impact within the hydraulic fracturing industry (fracking) as a microcosm.
It’s important to first understand the pros and cons of whether to utilize patents vs. trade secrets to protect intellectual property. At their most basic, patents provide a legal monopoly for a set period (typically about 20 years) but require public disclosure, while trade secrets offer indefinite protection but rely on secrecy and can be vulnerable if independently discovered or reverse-engineered.
Because trade secrets are, well, secret, their usage can be difficult to observe, let alone actually study. Surveys of firms are one means of trying to
ascertain how widespread and effective they are. The researchers also drew on companies’ 10-K regulatory filings to the Securities and Exchange Commission (SEC). But these sources could only get them so far.
Such challenges were actually a significant factor in the researchers’ choice to focus on fracking. Cunningham’s research partner had long been studying fracking, and possessed ingredient-level data for wells; various known chemicals would be listed under purpose categories, but in several instances, “trade secret” would appear in lieu of a specific chemical in the fracking fluid’s formula. Most states with fracking activity require fracking firms to disclose the ingredients they use, with allowances to omit the names of trade secret ingredients. This gave them a tangible, real-world example to extrapolate from.
From there, the researchers reviewed state governments’ fracking-fluid ingredient disclosure requirements, and gathered data from and measured the productivity of nearly 48,000 fracked wells. These methods enabled them to gain some degree of scope regarding the proliferation of trade secrets, as well as their efficacy.
The study provides the first evidence regarding the effects of trade secret policy on secrecy-protected inventive activity. The authors found that wells using secret ingredients usually produced more oil and gas, meaning the secrets had real value. After the DTSA, the use of trade secrets increased substantially; further, firms appeared to be generating novel trade secret ingredients and recipes.
The team also investigated whether fracking companies were merely trying to take advantage of the secrecy afforded them as a means of cloaking additional toxic chemicals in their formulas. To do so, they linked each disclosed ingredient’s “chemical abstract service number” to an
Environmental Protection Agency (EPA) database on toxicity levels and tracked the number of lawsuits brought by environmental groups and other similar stakeholders. There was no evidence of additional toxicity associated with the policy or increased use of trade secrets. Further, the team found no evidence of decreases in patenting associated with the DTSA.
“From a research perspective, there was this idea that increasing secrecy protection might decrease innovation — and our findings are not consistent with that,” Cunningham said. “It seems like increased protection is actually beneficial for innovation — with the caveat that this is one industry.”
It seems like increased protection is actually beneficial for innovation.”

Cunningham concedes that the very nature of secrets all but guarantees that the generalizability of the results will be frustratingly ambiguous. And yet, there was enough evidence from the study to conclude that the trickle-down effect of the DTSA law is that companies used trade secrets more often, especially in states where they weren’t strongly protected before.
“Beyond fracking, where firms are required to disclose and substantiate their use of trade secret ingredients, it’s very hard to answer this fairly straightforward question of: ‘Does the increase in the strength of trade secret policy actually have any material effect on the use of trade secrets and whether or not firms come up with new ones?’” she said. “…I know this is an unsatisfactory academic thing, but more evidence is needed. What our study at least does is say: ‘Patenting isn’t everything. And when these policies come into play, we see strong, robust evidence for increases in trade secret use and increases in secrecy-protected inventive activity.’”
DEPARTMENT OF FINANCE
Yihui Pan
It’s no secret that the United States has seen a dramatic spike in political polarization over the past few decades. What might be surprising though, is that the modern divide between red and blue is reflected not only in voting trends, policy decisions, and social issues — it has a significant impact in how people invest their money, too.
Yihui Pan and her coauthors were looking for innovative ways to track the trend in an empirical and data-driven way. They developed a simple conceptual framework to uncover Partisan Portfolio Disagreement (PPD), which captures the extent to which Democratic- and Republican-leaning investors hold consistently different equity portfolios.
They conducted a literature review, and while they found that prior research conclusively linked a preferred political party’s control of the government to confidence in the economy, and oppositional control with investing in “safer” assets, they could not find any research investigating links between political persuasions and the selection of individual stocks.
So they did it themselves, with one basic question underpinning their work: Do Republicans and Democrats invest differently in the stock market, and how has such investment behavior evolved over time?
Their study used data from more than 1,600 independent portfolio investment advisers in more than 300 U.S. counties spanning the years 2001-19, with a focus on high-income investors (the average account size was $4.6 million, and these accounts covered $1 trillion in assets by 2019 and represented 3% of the U.S. equity market).
The team also tracked the voting histories of these counties, self-reported political affiliations (from the Gallup survey), and studied the effect of Sinclair Broadcast Group (a conservative news network) entering new markets during this span.
Upon assembling this cache of data, the researchers created PPD to assess how people’s political beliefs affect the way they invest in the stock market. What they found is that more and more stocks are becoming “partisan,” with evidence to show that Republicans and Democrats invest very differently, and that the trend grew significantly during that timeframe.
In 2001, right- and left-leaning investors’ portfolios were exceedingly similar, with a Partisan Portfolio Disagreement of just 1.3%. But by 2019, the PPD had grown to about 20% — meaning one-fifth of the average investor’s portfolio differed depending on their political leanings.
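A measure like this can be sketched in a few lines. The function below is an illustrative assumption, not the authors’ exact specification: it computes half the sum of absolute differences between two groups’ average portfolio weights, which equals the fraction of holdings that would have to be reallocated to make the portfolios identical.

```python
def portfolio_disagreement(dem_weights, rep_weights):
    """Half the L1 distance between two groups' average portfolio weights:
    the fraction of the portfolio that would have to be reallocated to make
    the two groups' holdings identical (0.0 = identical, 1.0 = disjoint)."""
    stocks = set(dem_weights) | set(rep_weights)
    return 0.5 * sum(
        abs(dem_weights.get(s, 0.0) - rep_weights.get(s, 0.0)) for s in stocks
    )

# Identical portfolios: no disagreement.
print(portfolio_disagreement({"A": 0.5, "B": 0.5}, {"A": 0.5, "B": 0.5}))  # 0.0

# 20 percentage points of weight shifted from stock A to stock B yields a
# disagreement of 0.2, i.e., one-fifth of the portfolio differs by party.
print(round(portfolio_disagreement({"A": 0.6, "B": 0.4}, {"A": 0.4, "B": 0.6}), 3))
```

On this toy scale, the paper’s reported shift from 1.3% to roughly 20% corresponds to moving from nearly identical holdings to portfolios that differ by a fifth of their weight.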
Their paper noted “the largest increases in partisan disagreement occurring in consumer goods industries” (such as cars or clothing brands) “and industries with significant environmental footprints” (like oil and mining). “We further show that the increase in PPD is more pronounced for stocks subject to increasing ideological polarization on environmental and social issues and growing dislike of the opposing party.”
For example, Democratic investors tended to avoid companies with environmental or labor issues while embracing “green” technology, whereas Republicans generally cared less about those issues in their investment choices.
Furthermore, their research established that when Sinclair entered a market, Republican voting increased in that county — and so did the area’s conservative-style investing, which suggests that media and political beliefs indeed can change how people invest.
It’s human nature to invest in the things you care about, but you also need to keep an open mind.”
The potential side effect is that such investment behavior could affect how companies behave or push them to take political sides depending on who their investors are.
Pan also sees a practical problem so basic as to be Finance 101.
“We always say diversification is important because you can hedge away idiosyncratic stock risks. In this context, diversification may be — from a societal point of view — even more important because we don’t want a vicious circle of just reinforcing the same ideologies again and again,” she said. “… Yes, it’s human nature to invest in the things you care about, but you also need to keep an open mind.
“Companies also need to be mindful, because the lack of diversification could lead to higher costs of capital,” she added. “If you don’t have a diversified base of investors, this will lead to limited risk-sharing and will increase the cost of capital for a firm. It’s more beneficial to broaden the investor base.”

DEPARTMENT OF MANAGEMENT
Glen Kreiner
Although COVID-19 no longer dominates the zeitgeist the way it did five years ago, its impact lives on today in many of our social structures. For instance, mental health in the workplace simply was not a subject most employers gave much if any thought to before the coronavirus pandemic occurred. But with social distancing and work-from-home policies becoming increasingly commonplace and sometimes mandated, the topic became unavoidable.
The pandemic helped normalize conversations around mental health that didn’t exist five-plus years ago, making the topic a part of our regular day-to-day discourse. Indeed, many executives have recognized over the long term the need to keep it on their radar.
“This became a national conversation in a way that it wasn’t before COVID, and business leaders had no other choice than to address the mental health issues of their workplace,” said Glen Kreiner. “We’ve seen many more organizations being overt about approaching mental health for their workers, supervisors, managers, and leaders. We see a lot more organizations now focusing on mental health than we did before.”
Kreiner has been researching stigma for most of his career, even going back to his doctoral program. But it wasn’t until he was asked to bring a business faculty lens to the national “Stop Stigma Together” initiative (which has roots at the University of Utah) that he began to see the potential for a new area of study.
Soon enough, he was turning his attention from start-up checklists, organizational structures, and team facilitation to something altogether more academic and yet simultaneously pragmatic: investigating how the stigma of mental health impacts the workplace, and ascertaining what companies are doing to deal with this modern reality.
Whereas previous research linking stigma to work was centered around such topics as undesirable occupations, or firms affected by a scandal or crisis, Kreiner wanted to join the personal level with the organizational level through a perspective of how employees’ mental health impacts workplace performance.
Many of the ensuing findings, he points out, make a compelling case for companies to invest in their employees’ mental well-being, even now that the pandemic feels like a distant bad memory.
Notably, every year sees 1 in 5 adults experiencing a serious mental health challenge, and 1 in 3 will be diagnosed with a mental illness at some point.
“When we realize the prevalence of it, then leaders start to say, ‘Oh, this isn’t something that’s out there, that’s far away, an ‘us vs. them’ kind of thing,’” said Kreiner. “It’s, ‘No, this could be happening to any one of us at any time,’ and that mindset shift changes the way we think about mental health.”
Promisingly, he has found that blanket “leave your problems at home” policies are becoming increasingly obsolete and giving way to more
nuanced solutions. Many C-suite executives are now choosing to invest in mental health resources, such as partnering with employee assistance programs or providing health insurance plans wherein therapy is a covered benefit.
We see a lot more organizations now focusing on mental health than we did before.”

Some of them do it out of a sense of altruism or ethics or morals or compassion. For those that don’t however, there is also a compelling financial case to be made.
Kreiner notes that in a study, 81% of workers identified mental health support as an important factor in choosing a workplace, and 62% of workers said they would stay at a job for robust mental health benefits.
In simple terms, there is a legitimate payoff for companies in the form of reduced absenteeism and turnover.
“One of the downstream consequences of this is it’s creating a competitive marketplace now for employees and potential employees,” Kreiner said. “So many organizations are providing robust mental health benefits and a robust mental health climate that organizations that don’t are falling behind in how attractive they are. They have to be doing a good job of [promoting] mental health or they’re not going to get people applying, they’re not going to get people staying.”
DEPARTMENT OF MARKETING
David Dolifka
When you’re driving home from work and considering your dinner options, what is it that prompts you to swing by a fast-food drive-thru as opposed to just using the items in your refrigerator and pantry that you’ve already paid for? Yes, convenience certainly plays a role, but as it turns out, happiness with your job could be a significant factor, too.
David Dolifka’s recent and ongoing research has delved into some of the habits, mindsets, and trends that impact consumer spending choices.
In one paper, he considers how people’s well-being at work impacts how they feel about and utilize their finances.
His basic premise was that if someone loves what they do, they may not feel like they require as much compensation to work in their job. As it turns out, irrespective of how much such people objectively earn on their paychecks, those in jobs they enjoy and derive satisfaction from tend to feel like they are well-paid.
“As a result of being compensated for something that you’re getting intrinsic, non-financial reward out of, you feel like, ‘Wow, this is icing on the cake. This is a bonus. I feel like I’m a rich person because I’m being paid so much to do something I love,’” Dolifka explained.
So, given that people who feel wealthy tend to spend more than those who do not, the downstream effect is that happy workers engage in more discretionary spending.
He ran 20 or so experiments, surveyed consumers about their working and spending experiences, and looked at government data on consumer finances to account for a wide range of variables that impact our lives
and attitudes. When controlling for such variables, the results were consistent: If you have two people with the same job title, same responsibilities, same salary, benefits, security — their jobs are identical in every way except that one loves what they do and the other does not — the former will more often perceive themself as being wealthier and will spend more on fun if not strictly necessary items.
“If people can start to understand their own psychological processes more and realize that, ‘Hey, I’m kind of susceptible on really good days of work, feeling like I’m making more money and then going and [spending] more,’” Dolifka said, “that can empower them through this awareness to possibly make better financial decisions if they’re so motivated.”
Another paper, meanwhile, focuses on the process of budgeting.
What he found is that people often budget based on how much they like budget categories, rather than focusing on how much they like specific items within categories. This is interesting — and potentially problematic — if people over-allocate to categories that are good on average, but that don’t actually contain many of the types of products and experiences people really want to buy. A useful approach to testing this theory is to compare whether people set budgets differently than they spend.
For example, he asked participants in an experiment to plan for a theoretical home remodel, and he divided them into two groups, “budgeters” and “spenders.” Budgeters had to decide how much money to allocate to four categories: furniture, appliances, televisions, and
structural improvements (such as painting or tile or bathroom fixture updates). Spenders, conversely, could focus on purchasing individual items or paying for specific tasks.
What he found is that “budgeters,” compared to “spenders,” devoted more money to the TVs category (because the TV category is generally well-liked) and routinely found themselves short-changing the three other categories. As a result, these hypothetical budgeters would end up with more TVs than they might prefer, whereas spenders would pick items or structural improvements that would offer the most benefit, regardless of which category they belonged to.
If people can start to understand their own psychological processes…that can empower them through this awareness to possibly make better financial decisions.”

Budgeters set more aside for good categories, whereas spenders tend to focus on the best-liked options, regardless of which budget category they belong to.
None of which means that setting a budget is bad or wrong, merely that it tends to evoke different questions or processes than pure spending. What this research shows is that the psychology of budgeting is distinct from the psychology of spending.
“Budgeting is so helpful for lots of reasons — for self-control, and just for the peace of mind of having a plan and being able to automate things and track things,” Dolifka said. “The takeaway is definitely not that budgeting is bad; the takeaway is that when people set budgets, they’re unaware of their own psychology that might lead them to set budgets they might otherwise wish had been slightly different.”
DEPARTMENT OF OPERATIONS AND INFORMATION SYSTEMS
Rohit Aggarwal
The ubiquity of artificial intelligence (AI) in the business world is inevitable, according to many devotees and practitioners. An increasingly familiar refrain to those still on the fence is that while you may not be supplanted by AI, you could be supplanted by someone who knows how to use it — the idea being to view it not as a potential replacement, but as a tool. There is some research, however, to suggest that for a growing number of young users, the “tool” best represented by AI is a crutch.
Rohit Aggarwal has studied the intersection of AI and business from numerous perspectives, and while he acknowledges how transformative to society it has already proven to be, some of his recent research, which investigates how using AI affects both productivity and learning, gives him serious pause.
In a partnered study, he ran several field experiments with participants of roughly undergraduate age from a coding boot camp. Each experiment involved a group of students who were not given access to AI tools, and a group of students who were initially provided AI access but then had it revoked partway through the class.
The first experiment featured students being given initially simple coding tasks, and then having the complexity ramp up as the class went along. Progress was measured in both accuracy and time of completion.
As expected, those with AI performed simple tasks with high accuracy and in a short timeframe, exceeding the performance of those without AI. Interestingly, though, as the tasks grew increasingly complex, those with AI saw their rate of progress slow and eventually plateau. And when their AI access was revoked for the final few and most complex tasks, many in this group could not complete them.
Meanwhile, the study found that while those who were not given AI access were at a disadvantage at the start, after a certain number of tasks their accuracy and time of completion started to eclipse that of the students who initially had AI.
For Aggarwal, there was a clear conclusion.
“We showed that early AI access can be detrimental,” he said. “There are people who may become overdependent on AI and may not develop the cognitive skills and the knowledge that is needed for completing complex tasks.”
What such students primarily lack, he explained, is the ability to deconstruct complex tasks into smaller, more manageable pieces, as well as mental models on how to systematically troubleshoot.
A second study followed a similar model, but rather than focusing on discrepancies between simple and complex tasks, it tracked performance differences between “experienced” and “inexperienced” developers.
There was a segment of students who had worked as professional developers in some other coding language before signing up for a class to learn a new one. Aggarwal and his partners were interested in seeing how their performance compared to that of students with little or no prior coding experience — once again with the control of no AI vs. granted/revoked AI.
One of the more intriguing findings encompassed those who had AI access: whereas the newbies tended to dump the entire task into the interface and ask it to complete the project, the experienced users usually deconstructed the task into smaller pieces, used more guided
prompts with specific instructions, and then reassembled the component pieces into a coherent whole at the end. The experienced users were superior in thinking through what information would be needed by the model to help them troubleshoot and debug certain errors they were seeing.
There are people who may become overdependent on AI and may not develop the cognitive skills and the knowledge that is needed for completing complex tasks.”
The AI granted/revoked segment of the study proved even more compelling: the experienced users did not suffer a consequential decline in performance upon having their AI platform taken away. While they took a little longer than when they had AI access, they generally were still able to complete the tasks without any significant difference in accuracy.
Such results reinforce for Aggarwal the notion that those who are exposed early to AI often become over-reliant upon it, to the point that it becomes a shortcut rather than a means to an end, because they focus on results at the expense of process. These people wind up disadvantaged as a result, because AI winds up becoming not a tool to help them do work more efficiently, but a substitute.
“For early learners, for those who do not already have some of that knowledge, instead of being supplementary, [AI usage] can be detrimental to their learning,” he said. “They do not have to learn, they do not have to troubleshoot on their own. They never learn how to approach a problem, how to creatively decompose a problem.”

DIVISION OF QUANTITATIVE ANALYSIS OF MARKETS AND ORGANIZATIONS
Jason Cook
There are many people who, for a variety of reasons, simply don’t have the ability to consistently provide for all their families’ basic needs and require help from assistance programs. Concurrent with this, though, is a perception among some that when a government provides such help, the people receiving it will inevitably take advantage and become disincentivized to work hard. But is that actually true?
Jason Cook has heard such critiques and decided to see for himself in a pair of co-authored papers.
One of them studies the effect of Supplemental Nutrition Assistance Program (SNAP) receipt on work decisions. It tests the assumption underpinning work requirements — that SNAP may cause people to work less — by using a random caseworker design.
The second paper tests work requirements directly, as opposed to the effect of SNAP receipt. Cook was interested in ascertaining the impact upon families with young children, considering them an important, policy-relevant group.
An unnamed “mountain plains” state used for the study has a provision wherein the moment a family’s youngest child reaches the age of 6, adult participants are no longer exempted from work requirements. This threshold made for a useful natural experiment: Any factor that might impact program usage — local labor market conditions, socioeconomic status, racial/ethnic differences, et cetera — should be equally represented among people who have a 6-year-old versus people who have a 5-year-and-11-month-old.
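The logic of that cutoff comparison can be sketched as follows. Everything here — the field names, the participation rates, and the six-month bandwidth — is invented for illustration; the sketch simply contrasts program participation for families just below and just above the age-6 threshold.

```python
def discontinuity_gap(records, cutoff=6.0, bandwidth=0.5):
    """Difference in mean SNAP participation between families whose youngest
    child's age is just above vs. just below the work-requirement cutoff."""
    below = [r["on_snap"] for r in records
             if cutoff - bandwidth <= r["youngest_age"] < cutoff]
    above = [r["on_snap"] for r in records
             if cutoff <= r["youngest_age"] < cutoff + bandwidth]
    return sum(above) / len(above) - sum(below) / len(below)

# Invented data: 80% participation just below the cutoff (parents exempt from
# work requirements), 59% just above it (parents subject to them).
records = (
    [{"youngest_age": 5.8, "on_snap": 1}] * 80
    + [{"youngest_age": 5.8, "on_snap": 0}] * 20
    + [{"youngest_age": 6.2, "on_snap": 1}] * 59
    + [{"youngest_age": 6.2, "on_snap": 0}] * 41
)
print(round(discontinuity_gap(records), 2))  # -0.21: participation drops at the cutoff
```

Because families on either side of the threshold are otherwise comparable, a drop in participation right at the cutoff can be attributed to the work requirement itself rather than to differences between the families.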
“There’s not been a huge body of work on SNAP like there has been other safety net programs,” Cook explained. “So that was the initial impetus — this is a super-important program, crucial for our safety net, and it’s relatively understudied.”
Between the two papers, Cook and his fellow researcher tracked pre- and post-requirement work and program participation rates, shadowed caseworkers and assessed their degree of helpfulness, evaluated job-search training, and investigated complaints of excessive bureaucratic red tape.
What they found, in general, was that work requirements did not reveal (let alone reduce) rampant grift; instead, fewer people who legitimately needed aid actually got it, on account of either dropping out of or being kicked off the program.
They wound up looking at about 220,000 people who applied for SNAP. What they found was that only about 1 in 3 applicants were working in the 3 months before they applied for SNAP; for these people, getting SNAP didn’t change how much they worked because they weren’t working to begin with. Among applicants who were working before being approved for SNAP, the general trend was that they worked a little less for the first few months after receiving it (usually on account of some external shock that precipitated needing to apply for SNAP in the first place), but afterward wound up working more than before.
“The whole foundation, the underpinning for why you might think you would need work requirements — it’s not borne out by the data,” Cook
said. “It doesn’t look like giving people benefits dramatically reduces their work behavior in the long run; it could even improve their work decisions.”
The other significant finding, meanwhile, is that when work requirements were enforced, about 26% of parents stopped receiving food aid — even though many still qualified.
There’s not been a huge body of work on SNAP… this is a super-important program, crucial for our safety net, and it’s relatively understudied.”
What Cook found was evidence of outsized “administrative burden” on applicants. More than half of those who lost benefits did so because they couldn’t keep up with complicated rules and paperwork, not because they made more money or didn’t want to work.
He even went through the SNAP application process himself to get firsthand knowledge of how imposing it actually is. Three-plus hours in, he found his mental bandwidth taxed by convoluted forms and documentation for childcare, utilities, and other expenses — all made even more difficult, he noted, if English is your second language. He added that it got even more complicated from there for those who make it through the initial application, with frequent caseworker meetings and regular documentation of job searches and résumé submissions required.
“Work reporting requirements, as they stand, are ineffective,” said Cook. “…That isn’t to say that they couldn’t be useful. If you restructure it to make it a voluntary program and make it meaningful — give real feedback, make the training useful — I think there could be a lot of benefit to that. But the mandatory nature of the current setup, it’s not working.”
