Aymen Rebouh
Published in Eureka Engineering
Sep 3, 2018

When you meet someone for the first time, it usually takes only a few seconds for that person to form a first impression of you.

CIDetector and Picture Analysis

In my opinion, it’s the same for online dating and your profile pictures. Your profile pictures can have a big impact on your online dating experience, because they’re what people see first.

There’s immense value in picture analysis, and there are so many interesting use cases. For example, some mobile applications, like Snapchat, analyze photos and apply filters to them to make the pictures communicate more emotion. You can also analyze photos to censor what looks like irrelevant and/or forbidden content, such as nudity. And a lot more 🤩.

I discovered CIDetector in the Core Image framework, and in this small article, I am going to show you what I found during my quick experimentation.

Have you used Core Image before?

Core Image is a powerful API built into Cocoa Touch.

Personally, I don’t use it every day, but it’s interesting to see that there are some incredible and useful features inside.

CoreImage — CIDetector: What is it?

CIDetector is an image processor object. You give it an image, and it will find information in that image for you. That information can be one of the following (a quick sketch of each detector type follows the list):

  • Faces
  • Rectangles
  • QRCode
  • Text
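
As a quick sketch, creating a detector looks the same for every type; only the type constant changes. The high-accuracy option below is my own choice, and CIDetectorAccuracyLow is also available if you prefer speed over precision:

import CoreImage

// One detector per supported type. High accuracy is slower but more precise;
// CIDetectorAccuracyLow trades precision for speed.
let options: [String: Any] = [CIDetectorAccuracy: CIDetectorAccuracyHigh]

let faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: options)
let rectangleDetector = CIDetector(ofType: CIDetectorTypeRectangle, context: nil, options: options)
let qrCodeDetector = CIDetector(ofType: CIDetectorTypeQRCode, context: nil, options: options)
let textDetector = CIDetector(ofType: CIDetectorTypeText, context: nil, options: options)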

For each of those feature types, you can dig into even more specific information. For a face, for example, you can find out (a small sketch follows this list):

  • Whether there is a smile or not
  • Whether the person is blinking or not
  • And probably other information (cf. Apple’s documentation)
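
To make that concrete, here is a tiny sketch of what a single CIFaceFeature can tell you. The helper name describe is mine, not from the article, and the smile/eye properties are only populated when the features were requested with the CIDetectorSmile and CIDetectorEyeBlink options:

import CoreImage

// Hypothetical helper: summarizes what one CIFaceFeature reports.
// hasSmile, leftEyeClosed and rightEyeClosed are only filled in when the features
// were requested with [CIDetectorSmile: true, CIDetectorEyeBlink: true].
func describe(_ face: CIFaceFeature) -> String {
    let smile = face.hasSmile ? "smiling" : "not smiling"
    let blink = (face.leftEyeClosed || face.rightEyeClosed) ? "blinking" : "eyes open"
    return "Face at \(face.bounds): \(smile), \(blink)"
}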

Smile detection

So, we have this image and we want to find all the faces that appear in it. After that, we want to see whether each person is smiling or not.

How can we do it?

#1 Use CIDetector for detecting faces

let detector = CIDetector(
    ofType: CIDetectorTypeFace,
    context: nil,
    options: [CIDetectorAccuracy: CIDetectorAccuracyHigh]
)!

#2 Use CIDetector’s features(in:options:) for detecting smiles

// Passing CIDetectorSmile: true is what makes face.hasSmile meaningful.
let faces = detector.features(
    in: CIImage(image: yourImage)!,
    options: [CIDetectorSmile: true]
) as? [CIFaceFeature] ?? []

#3 Then, do whatever you want with the results you got

for face in faces {
    // face.bounds, face.hasSmile, face.mouthPosition, etc.
}
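
Putting the three steps together, here is a minimal sketch of a helper that takes a UIImage and returns only the smiling faces. The function name and the early return on failure are my own choices, not from the original slides:

import UIKit
import CoreImage

// Minimal sketch: detect faces in a UIImage and keep only the smiling ones.
// Orientation handling and error reporting are left out for brevity.
func smilingFaces(in image: UIImage) -> [CIFaceFeature] {
    guard
        let ciImage = CIImage(image: image),
        let detector = CIDetector(
            ofType: CIDetectorTypeFace,
            context: nil,
            options: [CIDetectorAccuracy: CIDetectorAccuracyHigh]
        ),
        let faces = detector.features(
            in: ciImage,
            options: [CIDetectorSmile: true]
        ) as? [CIFaceFeature]
    else {
        return []
    }

    return faces.filter { $0.hasSmile }
}

// Usage:
// let happyFaces = smilingFaces(in: yourImage)
// happyFaces.forEach { print($0.bounds, $0.mouthPosition) }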

Have you ever had to analyze pictures for some reason? Let me know on Twitter: @aymenworks.

I presented this small subject at potatotips #54, a meetup where iOS and Android engineers share their tips, as simple as that 😅.

You can find the slides there.

Thanks for reading 🚀
