Individual Write-Up & Reflection | Rebecca Pizzitola

Befores and Afters

Prior to this course, I thought I wasn't creative enough to be a designer, work in Figma, or draw sketchy screens and low-fidelity prototypes. This class taught me that anyone can draw well enough for this type of work, and that I'm actually kind of a Figma god (a little practice goes a long way!). That one drawing lecture has honestly changed my life: I find myself sketching things way more often, and feeling confident and proud of the outcome. I am an artist, and a skilled painter and drawer, but that lecture still taught me so much about how to be a quick, carefree, and effective artist, which is priceless in this field. I previously struggled with my typical artist's perfectionism when it came to digital design and prototyping, but now I don't need to.

I also knew prior to CS 247B that our behavior is shaped by the world around us, but I honestly thought we had more control, and that my issues with social media consumption, difficulties creating and sticking to new routines, etc., were completely my fault. The readings in this class made me realize that there's so much more to it; I have to actively work against systems and apps designed to KEEP me doing bad behaviors and bad habits, which is really hard to do when you don't know how to hack your own psychology. For this reason, I appreciated the readings and the sketchnotes. It would have been really cool, and helpful for my understanding of behavior change, to try these methods in my own life; I think it can be more effective to try to change others' behavior after we've spent some time changing our own, like how therapy students are required to go to therapy before they can become licensed.

I also thought that ethics in technology was more cut and dried than it really is. I realize now, after our 10 weeks of Thursday discussions, that aside from really obvious forms of racism, inequality, ableism, etc., it can actually be very difficult to determine (both individually and as a group) what is and isn't ethical in technology. I've realized that pretty much everything, even things designed with the best of intentions, can swing too far and become manipulative or unethical. It is also really hard to think of all the ways something can go wrong, and many of today's hardships honestly couldn't have been anticipated (though many of them reasonably could have). I am taking away a tiny bit more empathy for designers, particularly small companies with small teams. I value 'assume good intentions' a tiny bit more now, while holding space for accountability, because there are so many wicked problems in design. I don't know that it's productive to discuss what is wrong with something unless we are also going to discuss how to fix it and how to keep that issue from recurring. However, I still think every individual and every company can and should do their due diligence to make the best, most altruistic decisions possible, and be far more transparent about their decisions and the whys behind them. I also think that capitalism puts too much pressure on designers to design for everyone, or to reach as wide an audience as possible. I am now more curious about designing for more niche audiences where possible, and about designing multiple versions of an app interface that can accommodate the abilities and preferences of a lot of users. This could get REALLY bad, though: I'm remembering 'separate but equal' from the segregation era in America, and I wouldn't want to recreate that digitally. But I also learned about the 'different but just as good' approach to digital design replicating in-person experiences in CS 347, and I really appreciated that perspective and am curious where else it could be applied.


Class experiences

What worked and didn’t work for you about the approach we followed?

As far as class structure and assignments go, I liked doing all of the readings and the sketchnotes; the readings chosen were really effective and interesting, and they are things I absolutely want to implement in my own life. I don't think the sketchnotes without any follow-up were effective for me and my memory. I wish we had gotten some guided practice with them, and been able to pick a few things throughout the quarter to try to implement. We did the measuring me assignment at the beginning, and I wish there had been some follow-up from that! It would have been really helpful for my understanding of behavior change to have space to implement these methods in my routines and reflect on successes and failures in class.


What is still unresolved about the project for you?

I am very proud of the prototype we were able to create, but part of me doesn't feel like it's that good of a solution. I made it a priority to give something back to the community we interviewed, based on conversations I had in CS 347 about student design projects drawing on a marginalized or under-resourced group without ever giving them anything tangible in return. And we did! Our Shortcuts idea was easily implemented and shared with our testers and interviewees who had iPhones! But it's not a very sophisticated solution, and I don't feel like our app does justice to the users who aren't new to the skin-picking intervention game (which is, honestly, most of us).


Ethical considerations

Nudging and Manipulation: What mechanisms does your project use to change behavior?  What makes them acceptable nudges?  Are there users for whom or use-cases of your project for which your mechanisms might become manipulative and why?

We use notifications, both app-centric and user-designed Shortcut automations, to nudge our users toward behavior change. In my opinion, what makes these nudges acceptable is that they are opt-in, can be easily turned off or deleted, and are ultimately sought out by the user to remedy a harmful behavior of their own, rather than something we impose on them. These notifications require more consent and user input than the standard 'can this app send you notifications?' pop-up of other apps. Informed consent, to me, is the clearest line to draw between nudging and manipulation.

Something we didn’t really consider but that might have been worth considering is the fact that, since we utilize the Apple Shortcuts feature available on iPhones, we are aiding and abetting any sort of manipulation Apple may or may not have within the Shortcuts app. I remember finding out about Shortcuts for the first time and feeling pressure to ‘optimize my phone’ by utilizing them, and that might be a use-case/edge-user in which our nudges become harmful for the user.


Privacy: How does your project respect users' privacy? What definition of privacy are you relying on in saying that it does so? In what possible future uses or developments of your project might it no longer protect users' privacy, and how could that development be avoided?

We chose not to store any user data centrally, and to collect only the bare minimum, such as a first name and the context around the user's skin-picking, in order to recommend appropriate interventions; we decided that in a world where this was fully developed, we would store that data on the user's device rather than collect it ourselves. I think the fact that we aren't even asking for a log-in, and are therefore not collecting emails or personal information, is a huge win for user privacy in a world where you need a log-in for everything. There are trade-offs with this approach, though. First, users wouldn't be able to use the app across multiple devices, and would need a phone backup if they wanted to transfer data to a new one. That's more work for them. Second, it would probably be harder to offer really good personalized insights, or any sort of 'here's what others find helpful' feature, without collecting data. In the future, if we wanted to develop some of those features, another conversation about encryption or anonymization would need to happen.
