What is image-based sexualised violence?
If someone publishes, shares or threatens to publish intimate images or videos of you, this is image-based (sexualised) violence.
Even if you previously agreed to the pictures or recordings being taken, no one has the right to publish them, and certainly not to use them to blackmail you or to pressure you into getting back together after a break-up.
Have you taken intimate photos/recordings with your boyfriend/partner, or sent him intimate photos of you?
Many people engage in sexting, which in itself is not a problem: the pictures you take are intended only for the two of you and are nobody else's concern. But if your partner or ex then threatens to publish these pictures or to pass them on to others, this is not just a massive breach of trust; it can also constitute a criminal offence.
Has someone obtained intimate pictures of you?
Private or intimate images can also be leaked by third parties. Perhaps a work colleague has found the images in an unsecured cloud, and now everyone has seen them, including your boss.
It can also be very distressing and have unpleasant consequences if intimate pictures or videos of you circulate in WhatsApp groups, and your parents or siblings, for example, see them.
Deepfakes and computer-generated images
Have you come across your face in pornographic material depicting a scene that never actually took place?
It is now possible to use software to manipulate videos and photos so they look deceptively real. This is often done without the consent of the person concerned and can include:
- So-called “nudify” apps that edit images and videos to give the impression that the person depicted in them is naked, even though in reality they are not.
- Inserting a person's face into existing (pornographic) image and video material.
- Completely artificially generated content.