What’s the point of the Facebook ads you’ve seen?

The ads at the center of the data-mining scandal involving Cambridge Analytica were designed to get you to click on, or call about, a video or a product.

These ads weren’t designed just to inform people about the topics in the news. Cambridge Analytica would also shape the stories that ran, or didn’t run, in your feed. The company also created a methodology to chart how Facebook’s algorithms were influencing you.

This was a key part of Cambridge Analytica’s operation. Alex Tayler, the Trump campaign’s former digital director, says the company used similar techniques when it was working with voters in Alabama’s special US Senate election in 2014.

These voters, he said, never saw the ads themselves. He designed them, he says, so that Cambridge Analytica could figure out “what behaviors or attitudes had come through on Facebook, what had not, and what work could be done.”

The microtargeting techniques that Cambridge Analytica developed, along with its various projects with the Trump campaign, were in evidence through the spring of 2016. Facebook offered suggestions on which advertisements to create to reach voters, but individual pages were still responsible for posting the actual ads, says Richard Lai, a Google policy analyst who worked with the company on federal elections outreach and advocacy.

He says the ads had similarly precise ways of tracking who clicked on them.
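To make that idea concrete, here is a minimal sketch of how per-ad click tracking might work, assuming the advertiser appends its own identifiers to each ad’s landing URL and logs them when the link is followed. The parameter names, function names, and URL are hypothetical illustrations, not Facebook’s or Cambridge Analytica’s actual mechanism.

```python
# Hedged sketch: tag each ad's landing URL with identifiers, then read them
# back when someone clicks. All names here are illustrative, not real APIs.
from urllib.parse import urlencode, urlparse, parse_qs

def tracked_url(landing_page: str, ad_id: str, segment_id: str) -> str:
    """Return the landing-page URL with tracking parameters appended."""
    query = urlencode({"ad_id": ad_id, "segment_id": segment_id})
    separator = "&" if urlparse(landing_page).query else "?"
    return f"{landing_page}{separator}{query}"

def record_click(clicked_url: str, click_log: list[dict]) -> None:
    """Parse the tracking parameters out of a clicked URL and log them."""
    params = parse_qs(urlparse(clicked_url).query)
    click_log.append({
        "ad_id": params.get("ad_id", [""])[0],
        "segment_id": params.get("segment_id", [""])[0],
    })

clicks: list[dict] = []
url = tracked_url("https://example.com/article", ad_id="ad-42", segment_id="segment-7")
record_click(url, clicks)  # each click reveals which ad and audience segment it came from
```

Under this kind of scheme, every click carries enough information to tell the advertiser which ad, and which audience segment, produced it.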

Here’s a sample of how a Facebook ad created by Cambridge Analytica in August 2014 appears to have worked. The screen shows an illustration of an audience segment (a group of potential voters) who read the headline of an article on a major news site like BuzzFeed, or who had a Gmail account and visited a search page. Ads are placed where they will reach that audience. For example, the photo on the right of the image is the logo of Real Simple magazine, one of the articles shown in the image.
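The audience segment described above can be thought of as a set of behavioral rules, and a user is shown the ad only if they match. The sketch below illustrates that idea under stated assumptions; the field names and the matching rule are guesses made for illustration, not Cambridge Analytica’s actual schema.

```python
# Hedged sketch of an audience segment: membership is decided by behavioral
# signals such as "read a headline on a target news site" or "has Gmail and
# visited a search page". Field names and logic are illustrative only.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    sites_visited: set[str] = field(default_factory=set)
    has_gmail: bool = False
    used_search: bool = False

@dataclass
class AudienceSegment:
    name: str
    target_sites: set[str] = field(default_factory=set)   # reading any one qualifies
    accept_gmail_and_search: bool = False

    def matches(self, user: UserProfile) -> bool:
        read_target_site = bool(self.target_sites & user.sites_visited)
        gmail_and_search = user.has_gmail and user.used_search
        return read_target_site or (self.accept_gmail_and_search and gmail_and_search)

segment = AudienceSegment(
    name="news-readers",
    target_sites={"buzzfeed.com"},
    accept_gmail_and_search=True,
)
user = UserProfile(sites_visited={"buzzfeed.com"}, has_gmail=True)
print(segment.matches(user))  # True: this user would be served the ad
```

The point of structuring the audience this way is that the same ad creative can be reused while only the segment definition changes.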

Here’s another Facebook ad created by Cambridge Analytica, this one with an illustration showing the subjects clicking on an article from The Atlantic or The Guardian. Again, advertisers put ads in front of the audience they think will be interested. The left side of the ad shows a picture of a participant (a voter) who has already clicked on the article. It indicates the articles were read and clicked on by the “targeted audience” and uses the same wording.

About two-thirds of this image is a link to a site that pops up with audio (the “audio plays” line). The audio plays in the background; the soft piano melody you hear is something called a “signature piece.” The text posted by the advertiser and the audio are aimed at women between the ages of 18 and 49. If the text isn’t read, however, the people the ad is aimed at won’t hear the audio, and on Facebook, people tend to listen to and engage with the audio as well as the text.
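The demographic rule mentioned above (women aged 18 to 49) is the simplest kind of targeting filter. Here is a minimal sketch of how such a rule could be expressed, assuming the advertiser has gender and age on file for each user; the data structure and function are hypothetical, introduced only for illustration.

```python
# Hedged sketch of a demographic targeting filter (women aged 18 to 49).
# The User fields and the filter function are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class User:
    gender: str
    age: int

def in_target_audience(user: User, gender: str = "female",
                       min_age: int = 18, max_age: int = 49) -> bool:
    """Return True if the user falls inside the targeted demographic."""
    return user.gender == gender and min_age <= user.age <= max_age

audience = [User("female", 34), User("male", 40), User("female", 52)]
targeted = [u for u in audience if in_target_audience(u)]
# Only the 34-year-old woman would be served the text-plus-audio ad.
```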

These ads existed before Cambridge Analytica told all advertisers to take their data from Facebook’s social graph to build microtargeted ads. The company had been successfully selling them to clients for years. This explains why the Department of Justice recently charged the company with collecting personal data on tens of millions of Facebook users and selling it on to the Trump campaign, even though the firm’s only real business was the microtargeting of a largely liberal audience.