Patterns & AI
Encode 2.1
This first one was pretty easy to understand, because I've made grids before in p5. As I do more of the encode process, I'm realizing that I understand how code works better than I can rewrite it verbatim. For example, I can't replicate the exact dimensioning in my head, but I know it's something proportional to the window. Or I know one way to do something, and then the code achieves a similar effect through a different approach than the one I had in mind.
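A rough sketch of the kind of proportional grid I mean (my own minimal version, not the original encode code): the cell size comes from dividing the canvas dimensions rather than being hard-coded.

```javascript
// My own rough grid sketch, not the original encode code.
// Cell size is proportional to the window instead of hard-coded.
let cols = 8;
let rows = 8;

function setup() {
  createCanvas(windowWidth, windowHeight);
  noLoop();
}

function draw() {
  background(240);
  let w = width / cols;   // cell width scales with the window
  let h = height / rows;  // cell height scales with the window
  for (let i = 0; i < cols; i++) {
    for (let j = 0; j < rows; j++) {
      rect(i * w, j * h, w, h);
    }
  }
}
```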
Encode 2.2
Similarly here, I thought the code was just drawing three rectangles in an x-position for-loop, but it was actually working like the grid in the first example, taking the y-position into account as well. Between encoding and decoding, I find decoding more fun, but encoding is more useful because I'm beginning to systematically understand how to recreate art that I like with code.
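My own comparison sketch (not the original code) of the difference I mean: a single x-loop only repeats the rectangles across one row, while a nested loop also steps down in y.

```javascript
// My own comparison, not the original: a single x-loop repeats the
// rectangles across one row, while a nested loop also steps in y,
// which is what the example code was actually doing.
function setup() {
  createCanvas(400, 400);
  noLoop();
}

function draw() {
  background(255);
  // what I assumed: three rectangles in one row
  for (let x = 0; x < 3; x++) {
    rect(20 + x * 60, 20, 40, 40);
  }
  // what the code was doing: a full grid, like Encode 2.1
  for (let x = 0; x < 3; x++) {
    for (let y = 0; y < 3; y++) {
      rect(20 + x * 60, 120 + y * 60, 40, 40);
    }
  }
}
```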

Decode 2.1

Decode 2.2
I didn't really struggle with understanding these two sketches because they are a lot like (1) the sketch I made last week and (2) a pattern maker I made in ICM last semester. For 2.2, I repurposed one of my sketches from last week that also used createGraphics to fill in the half circles.
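A rough version of the createGraphics approach I reused (my own simplified take, not the exact sketch): draw a pattern into an off-screen buffer, then mask it with a half circle so the pattern only shows inside that shape.

```javascript
// A simplified version of my approach, not the exact sketch: draw a
// stripe pattern into an off-screen createGraphics buffer, then mask
// it with a half circle so the pattern only fills that shape.
let pattern;

function setup() {
  createCanvas(400, 400);
  noLoop();

  // off-screen buffer holding the fill pattern
  pattern = createGraphics(200, 200);
  pattern.background(255);
  pattern.stroke(0);
  for (let y = 0; y < pattern.height; y += 10) {
    pattern.line(0, y, pattern.width, y);
  }
}

function draw() {
  background(230);

  // half-circle mask drawn into another buffer
  let maskG = createGraphics(200, 200);
  maskG.noStroke();
  maskG.fill(255);
  maskG.arc(100, 100, 200, 200, PI, TWO_PI); // top half circle

  // apply the mask to a snapshot of the pattern and draw it
  let img = pattern.get();
  img.mask(maskG.get());
  image(img, 100, 100);
}
```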
Recode 2.1
It was really refreshing to see Vera Molnar's piece De La Serie (Des) Ordres in the slides and know exactly how I'd code it. Recreating it went really quickly, especially using Decode 2.1 as a framework.
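My own rough take on the idea (not a faithful reproduction of either the original piece or my recode): a grid of concentric squares where small random offsets disturb the otherwise regular order.

```javascript
// My own rough take on the (Des) Ordres idea, not a faithful copy:
// a grid of concentric squares with small random disturbances.
function setup() {
  createCanvas(600, 600);
  noLoop();
}

function draw() {
  background(255);
  stroke(0);
  noFill();
  let cell = width / 10;
  for (let x = 0; x < width; x += cell) {
    for (let y = 0; y < height; y += cell) {
      // several nested squares per cell, each nudged slightly
      for (let s = cell; s > 4; s -= 8) {
        let jx = random(-2, 2);
        let jy = random(-2, 2);
        rect(x + (cell - s) / 2 + jx, y + (cell - s) / 2 + jy, s, s);
      }
    }
  }
}
```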
Recode 2.2
After using GitHub Copilot, I feel a lot better about my understanding of code. I wanted the designs to swap places, and my initial few prompts just kept producing code that didn't work. When I asked it to change things, it wrote more functions that also didn't do what I wanted visually. However, I knew a better way of structuring the code, so I eventually had the AI start from scratch and prompted it specifically for that method. I was sort of testing the AI to see if it would try different methods on its own after I said something wasn't working, but very often it wouldn't change much and gave me code that didn't work or didn't even adhere to my prompts in the strictest sense. I can definitely see how AI needs to be used by people who understand what they want and know how to ask for it. I'm fairly confident I could have achieved what the AI did, but it definitely would have taken me significantly more time. As of right now, I think I prefer ChatGPT over GitHub Copilot because it's better at adapting when things aren't working.

I wouldn't say I was tempted to use AI otherwise this week, but that's probably because everything I wanted to do was either something I already knew how to do or (more often) something I arrived at by tinkering with sketches until I got something I liked, rather than starting from a specific idea. Usually I just think: What if this moved? What if the colors were different? What if I added another shape? I like this part of the creative coding process the most, and I like being pleasantly surprised by what I've created. I think I would have been more tempted to use AI if I had outcomes that needed to happen or constraints that I didn't know how to replicate. For example, having to recreate a complicated sketch from scratch just by looking at it, without doing any research or using old code I have as a framework.
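The structure I had in mind was roughly this (a hedged sketch of the general idea, not the code Copilot eventually produced): keep the two designs as separate drawing functions, store their positions in an array, and swap the array entries to make the designs trade places.

```javascript
// A hedged sketch of the structure, not Copilot's actual output:
// each design is its own function, their positions live in an array,
// and swapping the array entries swaps the designs.
let spots = [
  { x: 100, y: 200 },
  { x: 300, y: 200 },
];

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(255);
  designA(spots[0].x, spots[0].y);
  designB(spots[1].x, spots[1].y);
}

function mousePressed() {
  // swap the two positions so the designs trade places
  [spots[0], spots[1]] = [spots[1], spots[0]];
}

function designA(x, y) {
  fill(200, 50, 50);
  circle(x, y, 80);
}

function designB(x, y) {
  fill(50, 50, 200);
  square(x - 40, y - 40, 80);
}
```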