
CCT218 Exam Guide:

Social construction of technology: Technologies are the product of social processes, political choices, local cultures, etc. This includes the cultures of the engineers who design them and the interests of investors and governments. It asks why a technology is shaped the way it is, and from which worldviews and interests it emerged. Ex, Facebook's algorithms, designed to capture users and adapt in order to gather more personal data and fulfill the company's business model.

Agenda setting: Part of the power of mass media: they may not convince people of a position, but they can decide what the public will see as matters of interest that need to be discussed. Ex, if the media report repetitively on police violence/BLM, people will not necessarily change their position on the issue, but they will discuss it as a matter of public interest.

Remediation: The relation of rivalry, cooperation, competition, and homage among different media forms and technologies. The “newness” of media is not linear but the product of continuous mutual influence between older and newer media forms. This process never ends as long as the media remain part of a lively media ecology. Ex, the e-book is an evolution of thousands of years of book technology, adopting many features of traditional books (pages that can be turned, a table of contents).

Male gaze: Women in film and other forms of content are represented as objects to be seen and looked at rather than as active subjects. Men embody power, potential, and capability, whereas women's presence is characterized by being looked at by men. This serves to objectify women, making them the object of the gaze. Ex, Mulvey describes how women in film and advertising are fragmented in close-ups: headless bodies, disarticulated legs, lips, and torsos are used to sell anything from perfume to burgers, sneakers to cars.

Datafication: The process by which information is collected and made usable to an algorithm. Digital technologies “datafy” behaviours and interactions, transforming them into data that can be analyzed and then used to make choices. Ex, the Google Maps API, which is used by large numbers of third-party applications to gain access to geographic data and interactive maps. Entire ecosystems grow around each major platform, enabling other actors to participate in the platform economy. Google retains at least partial control over how other applications use the data, and thus over how these data shape user behaviour.
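To make the idea concrete, here is a minimal, hypothetical sketch of what "datafying" a behaviour might look like: a user action (a like, a click) is turned into a structured record an algorithm can analyze. All field and function names are illustrative, not drawn from any real platform.

```python
# Hypothetical sketch of datafication: turning a behaviour into a
# structured data point. Field names are illustrative only.
from datetime import datetime, timezone

def datafy_interaction(user_id, action, target):
    """Transform a user behaviour into a record an algorithm can analyze."""
    return {
        "user_id": user_id,
        "action": action,      # e.g. "like", "click", "share"
        "target": target,      # what the action was performed on
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

event = datafy_interaction("u42", "like", "post_1001")
print(event["action"])  # "like"
```

Once behaviours are stored in this form, they can be aggregated, sorted, and sold, which is what connects datafication to commodification and surveillance capitalism below.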

Algorithmic bias: How the design of a software system bakes in social and political biases already present in our societies, which are then reproduced (and reinforced). Ex, an HP computer's facial recognition system failed to detect Black faces but never failed to detect white ones. HP's answer was that Black faces were difficult to capture for “technical” reasons, which fails to acknowledge potential biases in the training datasets and in the programmers themselves.

Surveillance capitalism: Describes the economic systems that sustain surveillance through the ability of corporations to extract relevant data from any human behaviour. Datafication, algorithmic sorting, and commodification are the processes that drive it. Ex, Sidewalk Labs (a subsidiary of Google's parent company, Alphabet) proposed a “smart” neighbourhood in downtown Toronto that would be heavily surveilled, running the risk of the collected data being sold to third parties, thus supporting Google's business model.

Encoding/decoding: There are no single true meanings; rather, meanings are encoded in messages and decoded by readers. Encoding is the production of a message to convey a certain meaning (through symbols). Decoding is the process through which an audience member interprets the message. Ex, the Canadian flag can be encoded to mean Canada's autonomy from Britain or the unity of the country. When decoded, audience members can develop a negotiated/oppositional stance, e.g. the flag representing an authoritarian government or the legalization of drugs.

What is Technology? (Ursula Le Guin): Technology is the active human interface with the material world. It is how a society copes with physical reality: how people get, keep, and cook food; how they clothe themselves; what their power sources are; etc. Technology and high tech are not synonymous.

Theories of technology: Technological Determinism, Social construction of technology, Co-production of Technology and Society, Users matter.

Technological Determinism: Looks at how technologies have changed societies. Technologies enter society from the outside, as external and independent forces that have a strong impact on individual and social life. Tends to assume that technology is an influential force that shapes the overall form a society takes. Ex, the claim that social networks/movements will give rise to true, liberated selves.

Co-production of Technology: Theory in which technology and society continuously influence each other. Technologies allow and constrain people's behaviour through a series of limitations and possibilities. In turn, people shape technologies through use, by adopting, rejecting, or modifying technologies or some of their characteristics. Ex, Wikipedia is dominated by male editors, while feminist groups/users push back with more content.

Users Matter: The user is the human factor interacting with machines. Technology consumption is a complex cultural process with users as key actors who shape technology by using it and adapting it to their needs.

Affordances: Technology can both enable and limit the ways it can be used. Ex, Apple products, whose features and applications are limited to iOS.

Domestication: A process in which technologies become widely used and integrated parts of everyday life as people invest them with their own meanings and significance.

Theories of Media: Effects Theories, Reception Theories, Critical Theories, Political Economy, Uses and Gratifications.

Effects Theories: Look at how media affect the way audiences behave and how media can shape a society, e.g. the Hypodermic Needle theory. In the short term, media can affect our daily agenda; the long-term effects are more complex. Ex, the claim that violent video games cause violence in teenagers - Super Columbine Massacre RPG.

Reception Theories: Focus on the receiving end of media content - what audiences do to the media. Posit media consumption as active, not passive. Meanings are encoded by media producers but later decoded in different ways by their publics. Ex, some survivors of the Columbine High mass school shooting said that the game had helped them understand what had happened.

Uses and Gratifications: Looks at what the audience does with media. Studies the ways media audiences use media to satisfy certain needs. Ex, to get information; to present their identity in public or assert their power; to engage in social interactions; to be entertained.

Critical Theories: Include critical race and gender studies. Ex, the Frankfurt School linked modern media to the rise of capitalism: through their power over the media system, the state and corporations control how we think.

Political Economy: Looks at media and technology from the viewpoint of the nexus between money and power. Is concerned with labour (how people work with tech), value (how value is created and appropriated through tech), and property (who owns the media). Ex, Google search results for “unprofessional hairstyles.”

New Media: A problematic term because of the ideological implication that “new” means “better” - it masks the fact that all media were once new. Can be useful because it avoids focusing on a single technology/characteristic and points to the emergence and evolution of new technologies as one of the core aspects of contemporary societies.

Identity Crisis (Gitelman): An uncertain phase when a new media technology emerges, its meaning and role are in flux, and the technology is more open to creative practices or unpredictable uses. Resolved when the perceptions of the medium, as well as its practical uses, are adapted to existing categories of public understanding about what that medium does for whom.

Technology Domestication: The process of acceptance of a new technology in people's everyday lives and the stabilization of its meaning and role. The ultimate meanings or functions of new media are shaped over time by a society's existing habits of media use, by shared desires for new uses, and by the slow process of adaptation between the two.

Linear Evolution of Media: Common tropes for imagined media futures - supersession and transparency. The new is built on the old, ex. telephones -> phones. These narratives tend to erase their own historical contexts, and the producers and adopters of new media forms tend to represent them as better than the old ones.

Supersession: The notion that each new medium “vanquishes or subsumes its predecessors.” Associated with the Linear Evolution of Media.

Transparency: The assumption that each new medium mediates less, that it "frees" information from the constraints of previously inadequate or "unnatural" media forms that represented reality less perfectly. Associated with the Linear Evolution of Media.

Imagined media pasts: Supporters of older media forms tend to claim that those forms are more authentic, e.g. that the printed book gives readers a more authentic or deeper relation with both the content and the technology.

Dead Media (Gitelman): All new media will sooner or later become old - relates to planned obsolescence. Short-lived and failed media, which have disappeared and seem bizarre today, have still left traces in the world.

Planned obsolescence: A process by which the industry designs technologies that are meant to become old faster and faster. Relates to how new media will soon become the new old media. Ex, the ever-accelerating pace at which we replace our computers or phones.

Undead media: Media whose death was announced over and over again and yet insist on being alive. Ex, the printed newspaper.

Zombie media: Media that die and are then resurrected for new uses and contexts - a medium has to be commercially dead at one point. Shows how new media can remain new through the agency of users. Ex, Game Boys or 8-bit music.

Algorithms: Sets of encoded procedures that process data and instructions and produce an output based on calculations. These are programs that follow procedural logics in order to generate specific outputs. Based on the automation of functions like selection, curation, etc. Ex, search, pattern recognition, recommendation, auto-correction, prediction, profiling, simulation.
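The definition above ("encoded procedures turning inputs into outputs") can be illustrated with a toy example of selection/curation: ranking items by relevance to a query. This is a deliberately simplified sketch, not how any real search or feed algorithm works.

```python
# Toy illustration of an algorithm: encoded procedures that turn
# inputs (items + a query) into an output (a ranked selection).
def rank_by_relevance(items, query):
    """Score each item by word overlap with the query; highest first."""
    query_words = set(query.lower().split())

    def score(item):
        return len(query_words & set(item.lower().split()))

    return sorted(items, key=score, reverse=True)

posts = ["cats playing piano", "local election results", "election polls update"]
ranked = rank_by_relevance(posts, "election results")
print(ranked[0])  # "local election results"
```

Even in this tiny sketch, the choice of scoring rule is a design decision - which is exactly the point made under "algorithmic bias" and "objectivity": selection criteria are built in by people, not neutral.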

Public relevance algorithms (Gillespie): Algorithms have the power to enable and assign meanings through relevance. They have a key role in producing and certifying knowledge and shaping behaviours; that is, they have the power to shape socio-political and cultural worlds.

Selection: Algorithms automate the use of data to organize activities, turning inputs into outputs through the selection/curation of the most relevant topics, terms, objects, offers, etc. They sort information to adapt to and modify user behaviour. Core features include personalization, moderation, and forecasting.

Personalization: Platforms algorithmically determine the interests, desires, and needs of each user on the basis of a wide variety of datafied user signals. Has to do with improving UX but also with commercial logics. It is data-intensive, based on the ability to capture valuable data from user activity. Ex, the algorithmic calculation of a user's gender.

Forecasting: Many platforms are based on cycles of anticipation that predict users' actions and create incentives to remain on the platform. Has to do with the ability to statistically weigh the likelihood of an event. Ex, Netflix's recommendation system.
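"Statistically weighing the likelihood of an event" can be as simple as relative frequency. The sketch below is a hypothetical, bare-bones stand-in for what real recommender systems do with far richer signals; the genre data and function name are invented for illustration.

```python
# Toy forecasting sketch: estimate the likelihood of a user's next
# action from past behaviour using simple relative frequency.
from collections import Counter

def predict_next_genre(watch_history):
    """Return the most-watched genre and its share of past views."""
    counts = Counter(watch_history)
    genre, n = counts.most_common(1)[0]
    likelihood = n / len(watch_history)
    return genre, likelihood

history = ["drama", "comedy", "drama", "drama", "thriller"]
print(predict_next_genre(history))  # ("drama", 0.6)
```

Note how the prediction also creates an incentive loop: recommending more drama makes future drama views more likely, which is the "cycle of anticipation" the definition describes.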

Objectivity: A false image in which tech companies present their services as neutral, cold machines that can ‘stabilize truth’ free from human bias, subjectivity, external influence, and errors. They are NOT NEUTRAL; they tend to be perceived as black boxes and reflect these actors' biases, values, and interests. Ex, Google's PageRank.

Human Curation: Many tech companies actively hide the human labour behind algorithmic systems. Users and workers are included in algorithm functioning or training - often AI/algorithmic systems cannot function without humans. Technology creates new tasks for humans rather than automating work out of existence. Ex, content moderation farms in the Philippines.

Commodification: Transforms activities into tradable commodities, or into the basis for the commodification of other aspects of a platform or service. Is intensified by mechanisms of datafication, as the massive amounts of user data collected and processed provide insight into users at particular moments in time. It is pervasive because our media ecosystem is highly privatized.

Surveillance capitalism (Zuboff): The idea that everything we do through digital mediation feeds capital's profits. Commodification processes are based on different business models - subscriptions, personalized ads, etc.

Methodological challenges (to studying algorithms): Three characteristics of algorithmic media that make them difficult to study: blackboxed, heterogeneous, and dynamic.

Blackboxed: Algorithms are proprietary and not open to scrutiny; they are created in corporate environments, and code is subject to IPRs and non-disclosure agreements. Companies limit access for commercial reasons but also to limit people's ability to game the system.

Heterogeneous: Algorithms are not easy to deconstruct, as they are part of complex systems with hundreds of other algorithms and thousands of datasets, and are the product of collective work. Ex, no single person grasps the functioning of PageRank.

Dynamic: Algorithms are moving targets - not fixed but constantly unfold in multiple ways. Each user is targeted individually; at any given time, companies run dozens of different versions; randomness can be built into an algorithm's design to make it less predictable.

Strategies to study algorithms: ANALYZE/REPRODUCE CODE, REVERSE ENGINEERING (experiment in practice), INTERVIEW DESIGNERS (to understand choices, values, decisions, the story), ACCESS COMPANY DOCS (in relation to corporate goals and financial logics), STUDY “IN THE WILD” (how they are deployed in different contexts and how they perform tasks).

Media representations: The ways in which the media portray a certain group, event, or topic. They do not overlap completely with the underlying reality - they are filtered through the ideological lenses, values, and perspectives of media producers. Audiences are active and may assign their own meanings to a specific representation.

Systems of representation (Hall): Consist of the mental representations we carry around in our minds and of how shared meanings are constructed and represented within language (which includes images and symbols). There is a strong connection between representation and systems of power.

Hegemonic representations: Ways in which certain subjects or groups are represented that set the standards for what's “normal.” Critiqued because some groups tend to be framed and portrayed in limited, repetitive ways. Ex, Lara Croft; or, as a counter-example, The Great Canadian Baking Show.

The Bechdel Test: A way to look at the lack of representation of women in movies. Limitations: it fails to consider the context of women's presence or the message of the film, and it demonstrates the persistence of representational inequities in quantitative but not qualitative terms.

Tropes: Linguistic devices that help viewers, readers, listeners, and players identify with characters. They are culturally specific. Ex, the nerd - a trope that sets norms of what is masculine.

Stereotypes: Broadly understood, socially constructed, oversimplified images or ideas about certain 'types' of people. They exist beyond the media but circulate in a manner that is not only easy to recognize but also further perpetuates narrow ideas about cultures, communities, groups, and identities.

Oppositional gaze: bell hooks contested the conceptualization of the gaze, arguing that Black female spectators may actively choose not to identify with the imaginary subjects of a film because such identification is disempowering for them.

Reappropriation: A process that challenges, disrupts, or inverts the white male gaze through anti- or contra-straight, often playful and subversive responses to stereotypical representations. Ex, queer cultures.

Coded gaze (Buolamwini): The way in which the male (or white) gaze is baked into algorithmic technology. Software can be subject to racial and gender biases that reflect the views and cultural backgrounds of those who develop it. Ex, Google search results for Black women and girls call back to tropes and hypersexualization - Black women are also underrepresented among Google's employees.

Incoding: A way of seeing algorithmic bias as the product of design choices made by specific people working within specific relations of power and value systems - an active process of social construction of a technology.

Digital Echo Chamber: Can reinforce prior political views through selective exposure to political content selected by algorithmic technologies. BUT the primary driver is the actions of users - who we connect with online and which stories we click on. Works against the ideal public sphere, since it at least partially removes ideological clashes from view.

Ideal Public Sphere: A space for public discussion that is open and inclusive and concerned with matters of common political interest, based on physical spaces as well as media. In these spaces, the individuals who make up civil society are free to discuss politics without being subject to an external authority, and thus form a public opinion.

Emergence of the Public Sphere (Habermas): 18th-century Europe - made possible by the emergence and establishment of places that were public, inclusive, open, and concerned with matters of common political concern. Ex, French salons, British coffee houses, and newspapers.

Critiques of the Public Sphere: Criticized as an idealized view of our societies, based on liberal ideologies that gloss over conflicts and inequalities. Includes feminist and class-based critiques.

Feminist critique: Who can access the public sphere in a proactive and meaningful manner? Coffee houses and salons were primarily dominated by bourgeois white men; reading, writing, and politics were not afforded to women.

Class-based critique: Members of the bourgeoisie and economic elites have more power to influence the public sphere. Powerful elites maintain control over other classes by making them accept elite ideology as normal and desirable, as something that benefits all classes (Gramsci).

Power of mass media: Mass media have a strong power over the public sphere, which has never been a fully democratic space. Supported by watchdog, gatekeeping, and agenda-setting functions.

Watchdog: Mass media independent from governments and parties help create a free public sphere and can monitor the actions of the powerful - a vital democratic function. Ex, they can sustain professional journalists who identify the most important issues to be put on the public agenda for debate.

Gatekeeping: Professional information producers have the power to decide which information is distributed to the public (giving them the ability to filter and direct news), and thus get to influence what people talk about.

Digital Public Sphere: Interactions occur on digital platforms that allow all users to be consumers but also producers of information - democratizing access to the contemporary public sphere.

Disintermediation processes: The ways digital media democratized access to the production/circulation of information through increased independence from the professionals who traditionally served as intermediaries between audiences and information. Tend to be organized around the blurring of public and private.

Nature of the Public Sphere: It is actually highly privatized, and this affects how power is distributed across different actors (ex, Canada's top 5 media corporations). It is subject to state censorship and control - the case for democratic and authoritarian countries alike. Traditional obstacles, such as racism and sexism, remain firmly embedded in online discussion spaces.

Connective action: A form of political action in which movements counter established power or act for social change through social networking, meaningful connections, and the sharing of information. Ex, #IdleNoMore in response to Bill C-45.

Blogosphere: Blogs can be characterized by homophily, as they tend to link to information sources with which they share a common political ideology, thus reducing diversity and exchange among different perspectives. Can be an example of algorithmic gatekeeping.

Political homophily: People tend to prefer information that reinforces partisan sources over that which includes different voices. Has to do with a general polarization of our society.

Filter bubbles: How social media platforms act as new gatekeepers, mediated by algorithmic technology. Mainly propelled by our own decisions rather than by algorithms.

iTunes: When algorithmic recommendation kicks in, it can increase the diversity of music and help users explore and branch into new interests - a case with no evidence of an echo chamber. We tend to be more flexible with music than with our political ideas.

Facebook: Factors of influence for the news feed include the network of friends, the algorithm (which selects the content), and users' own decisions about what to click on. Profiles that follow each other tend to be like-minded.

[a]what would be the difference between datafication and data capture?

KC

CCT218 Exam Guide:

Social construction of technology: Technologies are the product of social processes, political

choices, local cultures, etc. This includes the cultures of the engineers who design them and the interests of investors and governments. It asks why is a technology shaped the way it is and from which worldviews and interests it emerged. Ex, algorithms of Facebook, designed to capture users and adapt in order to gather more personal data and fulfill business model.

Agenda setting: As part of the powers of mass media, they may not convince people, but can decide what the public will see as matters of interest that need to be discussed. Ex, if the media reports repetitively about police violence/BLM, this doesn’t mean that people will change their position about the issue. They will discuss it as a matter of public interest.

Remediation: The relation of rivalry and cooperation, competition, and homage, among different media forms and technologies. The “newness” of media is not linear, but the product of a continuous influence between older and newer media forms. This process never ends as long as they remain part of a lively media ecology. Ex, The Ebook is an evolution of thousands of years of book technology - adopting many features of traditional books (pages that can be turned, table of content).

Male gaze: Women in film or other forms of content are represented as objects to be seen and looked at rather than as active subjects. Men embody power, potential, and capability, whereas women's presence is characterized by being seen, alone, by men. This serves to objectify women, making them the object of the gaze. Ex, Mulvey describes how women in film and advertising are fragmented in close-ups - headless bodies, disarticulated legs, lips, and torsos, are used to sell anything from perfume to burgers, sneakers to cars.

Datafication: Process in which information must be collected and made usable to the algorithm. Digital technologies “datafy” behaviors and interactions, or transform them into data that can be analyzed and then used to make choices. Ex, G​​oogle Maps APIs, which is used by large numbers of third-party applications to gain access to geographic data and interactive maps. Entire ecosystems grow around each major platform, enabling other actors to participate in the platform economy. Google has at least partial control over how other applications use the data, and thus over how these data shape user behaviour.

Algorithmic bias: How the design of a software system is baked with social and political biases that are already present in our societies that are reproduced (and reinforced).  Ex, HP computer’s facial recognition system fails to capture black faces, but never fails to notice a white face. Hp’s answer was that black faces were difficult to capture for “technical” reasons but fails to acknowledge the potential biases in the trained datasets and the programmers themselves.

Surveillance capitalism: Describes the economic systems that sustain surveillance through the ability of corporations to extract relevant data from any human behaviour.  Datafication, algorithmic  sorting, and commodification are processes that drive this. Ex, Sidewalk Labs (a subsidiary of Google), proposed a “smart” neighbourhood in downtown Toronto, that would be heavily surveilled - running the risk of data collected being sold to third parties, thus supporting Google’s business model.

Encoding/decoding: There are no single true meanings, but meanings are encoded in messages and decoded by readers. Encoding is the production of a message to convey a certain message (through symbols). Decoding is the process through which an audience member interprets the message. Ex, the Canadian flag can be coded to mean Canada’s autonomy from Britain or the unity of the country. This can be decoded, and audience members can develop a negotiated/oppositional stance - the flag representing an authoritarian government or the legalization of drugs.

What is Technology? (Ursula LeGuin): Technology is the active human interface with the material world. It is how a society copes with physical reality: how people get and keep and cook food, how they clothe themselves, what their power sources, etc. Technology and Hi-tech is not synonymous.

Theories of technology: Technological Determinism, Social construction of technology, Co-production of Technology and Society, Users matter.

Technological Determinism: Views how technologies have changed societies. Technologies enter society from the outside, and are external and independent forces that have a strong impact on individual and social life. Tend to assume that technology is an influential force, that shapes the overall form a society takes. Ex, that social networks/movements will give rise to the true, liberated selves.

Co-production of Technology: Theory in which both technology and society both influence each other continuously. Technologies allow and constrain people’s behaviour through a series of limitations and possibilities. In turn, people shape technologies through use, by adopting, rejecting, or modifying technologies or some of their characteristics.  Ex, Wikipedia is dominated by male editors, while feminist groups/users push back with more content.

Users Matter: The user is the human factor interacting with machines. Technology consumption is a complex cultural process with users as key actors who shape technology by using it and adapting it to their needs.

Affordances: Technology can both enable and limit the ways it can be used. Ex,  Apple products, features and applications limited to the IOS

Domestication: A process in which technologies become wildly used and become

integrated parts of everyday life as people invest in it with their own meanings and significance.

Theories of Media: Effects theories, Reception Theories, Critical Theories, Political Economy, Users and Gratifications

Effects Theories:  Looks at how media affect the way audiences behave and how it can shape a society. Ie. Hypodermic Needle theory. In a short term, can influence to affect our daily agenda; the long-term effects are more complex. Ex, violent video games cause violence in teenagers - Super Columbine Massacre RPG.

Reception Theories Focus on the receiving end of media content - what audiences do to the media. Posit media consumption as active and not passive Meanings are encoded by media producers but later decoded in different ways by their publics. Ex, some of the survivors of the Columbine High mass school shooting said that the game had helped them understand what had happened.

Users and Gratifications: Looks at what the audience does with media. Studies the ways media audiences use media to satisfy certain needs. Ex, to get information; present their identity in public or assert their power; engage in social interactions; entertain.

Critical Theories: Includes critical race and gender studies. Ex, Frankfurt School - linked modern media to the rise of capitalism--they said that through their power over the

media system, the state and corporations control how we think.

Political Economy: Looks at media and technology from the viewpoint of the nexus between money and power. Is concerned with labour (how people work with tech), value (how is value created and appropriated through tech) and property (who owns the media). Ex, googling unprofessional hairstyles.

New Media: Problematic because of the ideological implication that “new” means “better” - it masks the fact that all media were once new. Can be useful because avoids focusing on a single technology/characteristic and points to the emergence and evolution of new technologies as one of the core aspects of contemporary societies.

Identity Crisis (Gitelman): A uncertain phase where a new media technology emerges, meaning and role are in flux, and the technology is more open to creative practices or unpredictable uses. Resolved when the perceptions of the medium, as well as its practical uses, are adapted to existing categories of public understanding about what that medium does for whom.

Technology Domestication: The process of the acceptance of new technology in people’s everyday lives and the stabilization of its meaning and role. Ultimate meanings or functions of new media are shaped over time by that society's existing habits of media use, by shared desires for new uses, and by the slow process of adaptation between the two.

Linear Evolution of Media: Common tropes for imagined media futures - supercession and transparaency. Old is built on new, ex. Telephones -> phones. Tend to erase their own historical contexts and the producers and adopters of new media forms tend to represent them as better than the old ones.

Supercession: The notion that each new medium "vanquishes or subsumes its predecessors. Associated with the Linear Evolution of Media.

Transparency: The assumption that each new medium mediates less, that it "frees" information from the constraints of previously inadequate or "unnatural" media forms that represented reality less perfectly. Associated with the Linear Evolution of Media.

Imagined media pasts: supporters of older media forms tend to say that they are more authentic, i.e. printed book gives readers a more authentic or deeper relation with both the content and the technology.

Dead Media (Gitelman):  All new media will sooner or later become old - relates to planned obsolenence. Short lived and failed media, which have disappeared and seem bizarre today, have still left traces in the world.

Planned obsolenence: A process which the industry designs technologies that are meant to become old faster and faster. Relates to how new media will soon become the new old media. Ex, The ever -accelerating pace at which we replace our computers or phones.

Undead media: Media whose death was announced over and over again and yet insist on being alive. Ex, the printed newspaper.

Zombie media: Media that die and are then resurrected to new uses and contexts - has to be commercially dead at one point.. Proves how new media can remain new through the agency of users. Ex, Gameboys or 8- bit music.

Algorithms: Sets of encoded procedures that process data and instructions and produce an output based on calculations. These are programs that follow procedural logics in order to generate specific outputs, based on the automation of functions like selection, curation, etc. Ex, search, pattern recognition, recommendation, auto-correction, predicting, profiling, simulation.
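A toy illustration (not from the course readings) of what "an encoded procedure that turns input data into an output" means in practice: a naive recommendation procedure that ranks catalogue items by how many tags they share with a user's history. All names and data here are hypothetical.

```python
def recommend(history_tags, catalogue):
    """Rank catalogue items by tag overlap with the user's history."""
    scores = {}
    for item, tags in catalogue.items():
        # Count shared tags: the "calculation" the output is based on.
        scores[item] = len(set(tags) & set(history_tags))
    # Sort highest-overlap first: selection/curation as an automated procedure.
    return sorted(scores, key=scores.get, reverse=True)

catalogue = {
    "doc_a": ["news", "politics"],
    "doc_b": ["music", "jazz"],
    "doc_c": ["politics", "opinion"],
}
print(recommend(["politics", "news"], catalogue))  # ['doc_a', 'doc_c', 'doc_b']
```

Real platform algorithms differ in scale and complexity, but follow the same procedural logic: datafied inputs, encoded rules, ranked outputs.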

Public relevance algorithms (Gillespie): Algorithms have the power to enable and assign meanings through relevancy. They have a key role in producing and certifying knowledge and in shaping behaviours; that is, they have the power to shape socio-political and cultural worlds.

Selection: Algorithms automate the use of data to organize activities, turning inputs into outputs through the selection/curation of the most relevant topics, terms, objects, offers, etc. They sort information to adapt to and modify user behaviour. Core features include personalization, moderation, and forecasting.

Personalization: Platforms algorithmically determine the interests, desires, and needs of each user on the basis of a wide variety of datafied user signals. Has to do with improving UX but also with commercial logics. It is data-intensive - based on the ability to capture valuable data from user activity. Ex, calculation of gender.

Forecasting: Many platforms are based on cycles of anticipation that predict users’ actions and create incentives to remain on the platform. Has to do with the ability to statistically weight the likelihood of an event. Ex, Netflix’s recommendation system.
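A minimal, hypothetical sketch of "statistically weighting the likelihood of an event": a platform-style predictor might estimate the probability that a user watches a given genre next from that genre's relative frequency in their viewing history. This is an illustrative simplification, not how any real recommender works.

```python
from collections import Counter

def likelihoods(history):
    """Return each genre's share of the viewing history as a probability."""
    counts = Counter(history)
    total = len(history)
    return {genre: count / total for genre, count in counts.items()}

history = ["drama", "drama", "comedy", "drama"]
probs = likelihoods(history)
print(probs["drama"])  # 0.75: drama is weighted as the most likely next watch
```

The "cycle of anticipation" comes from feeding the forecast back into what the platform shows next, which in turn shapes the future history it learns from.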

Objectivity: A false image in which tech companies present their services as neutral, cold machines that can ‘stabilize truth’ free from human bias, subjectivity, external influence, and errors. They are NOT NEUTRAL: they tend to be perceived as black boxes, and they reflect these actors’ biases, values, and interests. Ex, Google’s page ranking.

Human Curation: Many tech companies actively hide the human labour behind algorithmic systems. Users and workers are included in algorithm functioning or training - often AI/algorithmic systems cannot function without humans. Technology creates new tasks for humans rather than automating work out of existence. Ex, content moderation farms in the Philippines.

Commodification: Transforms activities into tradable commodities, or into the basis for the commodification of other aspects of a platform or service. Is intensified by mechanisms of datafication, as the massive amounts of user data collected and processed provide insight into users at particular moments in time. It is pervasive, as our media ecosystem is highly privatized.

Surveillance capitalism (Zuboff): The idea that everything we do through digital mediation feeds capital’s profits. Commodification processes are based on different business models - subscriptions, personalized ads, etc.

Methodological challenges (to algorithms): The three main features of algorithmic media that make them difficult to study: blackboxed, heterogeneous, and dynamic.

Blackboxed: Algorithms are proprietary and not open to scrutiny; they are created in corporate environments, and code is subject to IPRs and non-disclosure agreements. Companies limit access for commercial reasons but also to limit people’s ability to game the system.

Heterogeneous: Algorithms are not easy to deconstruct, as they are part of complex systems with hundreds of other algorithms and thousands of datasets, and are the product of collective work. Ex, no single person grasps the functioning of PageRank.

Dynamic: Algorithms are moving targets - not fixed but constantly unfold in multiple ways. Each user is targeted individually; at any given time, companies run dozens of different versions; randomness can be built into an algorithm's design to make it less predictable.

Strategies to study algorithms: ANALYZE/REPRODUCE CODE, REVERSE ENGINEERING (experiment in practice), INTERVIEW DESIGNERS (to understand choices, values, decisions, the story), ACCESS COMPANY DOCS (in relation to corporate goals and financial logics), STUDY “IN THE WILD” (how they are deployed in different contexts and how they perform tasks).
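A hypothetical sketch of the reverse-engineering strategy: treat the blackboxed system as an opaque function, vary its inputs systematically, record the outputs, and look for patterns from which to hypothesize its internal logic. The `black_box` function here is only a stand-in for a system a researcher could not inspect directly.

```python
def black_box(query):
    # Stand-in for an opaque ranking system whose code we cannot read.
    return sorted(query.split(), key=len)

def probe(inputs):
    """Record input/output pairs so we can look for patterns in the black box."""
    return {q: black_box(q) for q in inputs}

observations = probe(["cats dogs elephants", "ab abc a"])
# From the observations a researcher might hypothesize that the system
# ranks items by word length - a conjecture to test with further probes.
print(observations["ab abc a"])  # ['a', 'ab', 'abc']
```

This mirrors the methodological point above: without access to code, researchers can only infer the algorithm's logic from its observable behaviour, and the dynamic, personalized nature of real systems makes such inferences provisional.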

Media representations: The ways in which the media portrays a certain group, event, or topic. They do not overlap completely with the underlying reality - they are filtered through the ideological lenses, values, and perspectives of media producers. Audiences are active and may assign their own meanings to a specific representation.

Systems of representation (Hall): Consist of the mental representations we carry around in our minds and of how shared meanings are constructed and represented within language (which includes images and symbols). There is a strong connection between representation and systems of power.

Hegemonic representations: Ways in which certain subjects or groups are represented - set standards for what’s normal. Critiqued because some groups tend to be framed and portrayed in limited, repetitive ways. Ex, Lara Croft, or, as a counter-example, The Great Canadian Baking Show.

The Bechdel Test: A way to measure the (lack of) representation of women in movies: whether a film features at least two named women who talk to each other about something other than a man. Limitations - it fails to consider the context of this presence or the message of the film, and it demonstrates the persistence of representational inequities in quantitative terms but not qualitatively.

Tropes: Linguistic devices that help viewers, readers, listeners, and players to identify with characters. These are culturally specific. Ex, the nerd - sets norms of what is masculine.

Stereotypes: Broadly understood, socially constructed, oversimplified images or ideas about certain 'types' of people. They exist beyond the media but circulate in a manner that is not only easy to recognize but also further perpetuates narrow ideas about cultures, communities, groups, and identities.

Oppositional gaze (hooks): bell hooks contested the conceptualization of the gaze by arguing that Black female spectators may actively choose not to identify with the imaginary subjects of a film, because such identification is disenabling for them.

Reappropriation: A process that challenges, disrupts, or inverts the white male gaze through anti- or contra-straight, and often playful and subversive, responses to stereotypical representations. Ex, queer cultures.

Coded gaze (Buolamwini): The way in which the male (or white) gaze is baked into algorithmic technology. Software can be subject to racial and gender biases that reflect the views and cultural backgrounds of those who develop it. Ex, Google results for black women and girls call back to tropes and hypersexualization - Black women are also underrepresented among Google’s employees.

Incoding: One can see algorithmic bias as the product of design choices made by specific people working within specific relations of power and value systems. An active process of social construction of a technology.

Digital Echo Chamber: Can reinforce prior political views due to selective exposure to political content selected by algorithmic technologies. BUT the primary driver is the actions of users—who we connect with online and which stories we click on. Works against the formation of an ideal public sphere, since ideological clashes are at least partially removed from view.

Ideal Public Sphere: A space (physical or mediated) for public discussion which is open and inclusive and concerned with matters of common political interest, based on physical spaces as well as media. In these spaces, the individuals that make up civil society are free to discuss politics without being subject to an external authority, and thus to form a public opinion.

Emergence of the Public Sphere (Habermas): 18th-century Europe - made possible by the emergence and establishment of places which were public, inclusive, open, and concerned with matters of common political concern. Ex, French salons, British coffee houses, and newspapers.

Critiques of the Public Sphere: Criticized as an idealized view of our societies based on liberal ideologies that remove conflicts and inequalities. Includes feminist and class-based critiques.

Feminist critique: Who can access the public sphere in a proactive and meaningful manner? Coffee houses and salons were primarily dominated by bourgeois white men. Reading, writing, and politics were not afforded to women.

Class-based critique: Members of the bourgeoisie and economic elites have more power to influence the public sphere. Powerful elites maintain control over other classes by making them accept elite ideology as normal and desirable, as something that benefits all classes (Gramsci).

Power of mass media: The mass media have strong power over the public sphere, which has never been a fully democratic space. Supported by the watchdog role, gatekeeping, and agenda setting.

Watchdog: Mass media independent from governments and parties help create a free public sphere and can monitor the actions of the powerful - a vital democratic function. Ex, they can sustain professional journalists that identify the most important issues to be put on public agenda for debate.

Gatekeeping: Professional information producers have the power to decide which information is distributed to the public (giving them the ability to filter and direct news), and thus to influence what people talk about.

Digital Public Sphere: Interactions occur on digital platforms that allow all users to be not only consumers but also producers of information - democratizing access to the contemporary public sphere.

Disintermediation processes: The way digital media democratized access to the production/circulation of information through increased independence from the professionals who traditionally acted as intermediaries between audiences and information. Tend to be organized around the blurring of public and private.

Nature of the Public Sphere: Is actually highly privatized and this affects how power is distributed across different actors (Ex, Canada’s top 5 corps.). Is subject to state censorship and control - the case for democratic and authoritarian countries alike. Traditional obstacles such as racism and sexism, remain firmly embedded in online discussion space.

Connective action: A form of political action in which movements counter established power or act for social change through social networking, meaningful connections, and the sharing of information. Ex, #idlenomore in response to Bill C-45.

Blogosphere: Blogs can be characterized by homophily, as they tend to link information sources with which they share a common political ideology, thus reducing diversity and exchange among different perspectives. Can be an example of algorithmic gatekeeping.

Political homophily: People tend to prefer information from partisan sources that reinforces their views over information that includes different voices. Has to do with a general polarization of our society.

Filter bubbles: How social media platforms act as new gatekeepers, mediated by algorithmic crunching/technology. Mainly propelled by our own decisions rather than by algorithms.

iTunes: When algorithmic recommendations kick in, they can increase the diversity of music and help users explore and branch into new interests - a case with no evidence for an echo chamber. We tend to be more flexible when it comes to music than with our political ideas.

Facebook: Factors of influence for the news feed include the network of friends, the algorithm (which selects the content), and self-decisions/the will to click on something. Profiles that follow each other tend to be like-minded.
