The Office of Nigeria’s immediate past Vice President, Prof. Yemi Osinbajo, SAN, GCON, has issued a stern warning to the public regarding the circulation of fraudulent AI-generated videos misusing his image and voice to promote bogus products and schemes.
In a statement signed by his media aide, Mohammed Braimah, the office alerted the public to a growing number of deepfake videos falsely portraying Osinbajo as endorsing various commercial products and services. The manipulated content includes a video in which a computer-generated likeness of the former Vice President appears to advertise hypertension medication, and another in which a cloned voice touts a dubious money-making scheme.
“These videos are entirely fake and misleading,” the statement read. “Prof. Osinbajo has no affiliation with any of the products or schemes being promoted in these clips.”
The statement urged the public to remain vigilant, verify endorsements before acting on them, and disregard all such misleading material.
The emergence of deepfake technology has opened new avenues for online fraud, and this case is a stark reminder that even respected public figures can be impersonated by malicious actors.
The Office of the former Vice President reiterated its commitment to fighting misinformation and protecting Osinbajo’s reputation, encouraging media outlets and the public alike to report any further deceptive content.