Can Siri finally understand more than the predefined Intents?

Reading Time: 3 minutes

GUI is so ’90s

Lately, I find myself increasingly annoyed at having to use my phone for boring, recurring tasks like looking up the price of a specific cryptocurrency, especially in certain situations, e.g. when I’m at home. At home, I want to feel comfortable above all. That is hard to achieve when I have to get up and search for my phone, once again. Wouldn’t it be nice to just ask out loud in the room and get the answer?

Wait, but there is Siri, right? So what can it do for me, and what can developers achieve with it today?

Hey Siri, are we there yet?

It turns out that SiriKit offers a set of predefined intents ready to be used. That’s a start. But those cannot handle my specific request, and I guess plenty of others as well. To be fully usable, something like custom parametrized intents would be needed. I would like Siri to understand something like:

“Hey Siri, what’s the price of <your cryptocurrency> in <your currency>?”

To be fair, when asking this for Bitcoin and USD you do get an answer. Depending on how the question is formulated, Siri either shows a preview from the Stocks app or fetches something from the web. But when trying to get an answer for other, rather “unknown” cryptocurrencies, Siri struggles. I totally get that this question may seem fairly simple for a human to process, but it is certainly not that simple for Siri to filter out the domain in question and start the “right” app for the job.

Hence, for a start, I would also be satisfied with something in the form of a Q&A dialogue:

> “Hey Siri, cryptocurrency price”

< “For what cryptocurrency?”

> “Bitcoin”

< “In what currency?”

> “USD”

That way, developers could attach input dialogues to specific keywords (in this case “cryptocurrency price”) in order to collect parameters, process them, and render a response. Something similar to URL schemes, as sketched below.
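To illustrate the URL-scheme analogy, here is a minimal sketch of how an app could already accept the same two parameters via a deep link today. The “cryptoapp” scheme, the “price” host and the showPrice(of:in:) routing function are made-up names for this sketch, not part of any real app:

import UIKit

// Hypothetical deep link of the form: cryptoapp://price?coin=BTC&currency=USD
// The scheme, host and showPrice(of:in:) are assumed names for this sketch.
class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ app: UIApplication,
                     open url: URL,
                     options: [UIApplication.OpenURLOptionsKey: Any] = [:]) -> Bool {
        guard url.scheme == "cryptoapp",
              url.host == "price",
              let items = URLComponents(url: url, resolvingAgainstBaseURL: false)?.queryItems,
              let coin = items.first(where: { $0.name == "coin" })?.value,
              let currency = items.first(where: { $0.name == "currency" })?.value
        else { return false }

        showPrice(of: coin, in: currency)
        return true
    }

    // Stub: in a real app this would navigate to the price screen.
    private func showPrice(of coin: String, in currency: String) {
        print("Showing price of \(coin) in \(currency)")
    }
}

A parametrized intent would essentially be the voice equivalent of such a link: a fixed “route” plus a couple of user-supplied values.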

After looking a bit deeper, I stumbled upon an interesting blog post which clarified it for me.

There are also some hands-on blog posts on how to set up “Custom Intents”.

I’ll just wait here then

Since iOS 12 it is possible to create a custom Intent in the form of an Intents.intentdefinition file. Here, app developers can specify parameters which the app can process. To stick with the cryptocurrency example: when the user searches for the price of a cryptocurrency inside an app, the app can create an Intent with the parameters already filled out, e.g. “Show the price of Bitcoin in USD”. Furthermore, the app can then “donate” this specific, already parametrized Intent to the system. This “donation” appears on the lock screen and as a shortcut ready to be used.
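For illustration, here is a minimal sketch of what donating such a parametrized Intent could look like. ShowCryptoPriceIntent and its coin/currency parameters are assumed names for the class Xcode would generate from the Intents.intentdefinition file; INInteraction and suggestedInvocationPhrase are actual SiriKit API:

import Intents

// Assumption: ShowCryptoPriceIntent is generated by Xcode from the
// Intents.intentdefinition file and has "coin" and "currency" parameters.
func donatePriceLookup(coin: String, currency: String) {
    let intent = ShowCryptoPriceIntent()
    intent.coin = coin
    intent.currency = currency
    intent.suggestedInvocationPhrase = "\(coin) price in \(currency)"

    // Donate the already parametrized intent so it can appear on the
    // lock screen and be recorded as a Siri shortcut.
    INInteraction(intent: intent, response: nil).donate { error in
        if let error = error {
            print("Donation failed: \(error.localizedDescription)")
        }
    }
}

The app would typically call something like this right after the user has actually performed the corresponding search, so every donation reflects a real action.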

This means one can assign a custom Siri voice command to trigger this Intent. It also means that if you have 5 favourite cryptocurrencies and 3 favourite currencies, you would have to go through this step 15 times inside the app. Afterwards, you would need to assign 15 voice commands to those donations.

Well, honestly, this is not the way I would like it to be. But it’s a start, and I hope that with iOS 13 we get something like parametrized Intents that the user can trigger directly.

My expectations from UIKonf Berlin 2019

Reading Time: 3 minutes

After more than 6 years, this will be my first conference, this time as a guest. I remember well my presentation in Baku, Azerbaijan in 2013: a stressful experience, a lot of sweat, and a suit one size larger than it should have been. Back then the topics were economics and management. Now I’m an experienced iOS developer, going to Berlin for a conference that should give me new ideas for personal development, but also directions on how to think about future challenges.

Conference generalities

The venue of UIKonf is RADIALSYSTEM V in Berlin.

From 26 to 29 May 2019, serious iOS developers from big companies will come to Berlin to present their experiences in the iOS world.

The motto of the conference is simple:

“UIKonf is an independent conference for serious iOS developers”

Speakers

For this conference, the organizers decided to have only female speakers. All come from big companies and bring a lot of experience: female iOS developers who work or have worked for Uber, Slack, LinkedIn…

Some of the speakers

Costs

From my perspective, the travel expenses for this conference are the cheapest part. Fortunately, we Macedonians have a direct flight connection to Berlin; the cost is around 150€.

The ticket to attend the conference costs 539€.

They are also offering free tickets to members of underrepresented groups in tech (this includes disabled people and generally people who are unable to attend without financial assistance).

The accommodation is something I haven’t finalised yet. The hotels near the conference venue cost around 100€ per night.

Schedule

This conference is excellently organized.

The first day, 26 May, is reserved for social meetings: time for visiting places in Berlin and getting to know the other attendees. It finishes with a kick-off party where people pick up their badges.

The next two days, 27 and 28 May, are when the speakers give their presentations: 30 minutes per talk, with a question-and-answer part at the end of each.

29 May is the last day. This is the unofficial part of the conference, where people can meet the sponsors, dissect code problems in lab sessions with experienced experts, sign up for a workshop, or just hang out and code with new friends.

Expectations

I have big expectations. The names of the speakers, their experience, and the companies they have worked for are guarantees of something big and something good.

I’m going by myself and expect to meet new people, make new friendships, and share experiences.

Berlin as a city is something new for me. I have heard very good things about it: good parties, restaurants, parks, two different sides of the city. I have friends who live or have lived there.

With a suit in the correct size this time, it should be a good experience.

WeAreDevelopers World Congress 2019 – Expectations

Reading Time: 3 minutes

Why this one?

I’m a mobile developer, specifically an iOS developer. So guess what, I love writing code in Swift or Objective-C for Apple devices. When searching for a conference to attend in 2019, I rather had WWDC in mind. There is one drawback, however: you need some serious money and luck to get there. But in general, there are plenty of other good iOS conferences to attend (NSConf, CocoaConf, …, you name it!).

For an overview and a brief description of iOS conferences in 2019, I found this page from Hacking with Swift very informative.

There is a very nice overview diagram of iOS conferences depending on your location and/or budget in an old post (2017) at raywenderlich.com. Although the post is outdated, most of these conferences will also take place in 2019, and pricing and location rarely change dramatically.

Wait, but the post is about your expectations for the World Congress 2019 in Berlin. So what happened?

Ok, let’s get to this. When searching for iOS conferences in 2019, this conference somehow slipped into the search results. And it did so because Steve Wozniak was speaking there. Well, which iOS/Mac developer wouldn’t want to see and hear the legendary Steve Wozniak, right?

Once on the WeAreDevelopers homepage, I realised that Steve Wozniak had given his talk a year earlier, in 2018 in Vienna. Furthermore, this conference isn’t iOS or even mobile specific, which is another drawback. Anyway, while on the page I skimmed through this year’s speakers. Here are some of them:

  • Rasmus Lerdorf – Inventor of PHP
  • Håkon Wium Lie – Inventor of CSS
  • Andreas M. Antonopoulos – Author, Mastering Bitcoin

Well, maybe Steve Wozniak won’t be there, but there are a lot of high-quality people from IT in general, people who had and still have a tremendous impact on millions of developers out there. One can argue about the weaknesses and strengths of PHP, CSS and Bitcoin, but one thing is for sure: with all their downsides, these technologies are used and working on a daily basis. Furthermore, there are vibrant communities behind them.

Ok then. I’m going because I would like to see a broader picture of tech than just a little piece of it.

Btw, there are mobile-related talks anyway.

Expectations (TL;DR)

  • Experience some good talks from people with impact
  • Get a feeling where we’re headed in the future
  • Maybe see some demos/MVPs of AR/VR and robotics
  • Get answers on where Bitcoin and blockchain are going and what the state of disruption is
  • Socialising

Facial recognition technology and its effect on health insurance companies

Reading Time: 6 minutes

When I was young, I used to make fun of people who believe that others can read their future from the palm of their hand, and I still make fun of them. But during the last year I figured out that, with the help of technology, it is now easy to read a person’s hidden present.

Yesterday, as an iPhone user unlocking my phone with facial recognition, I took a moment to think and realised how far we have come with this technology across all business fields, especially in health insurance. So, being one of roughly 250,000,000 annual Apple customers (around 250,000,000 iPhones are sold per year on average), I became really surprised and fascinated by facial recognition technology, to the point that my jaw dropped.

Of late, it appears that our face has become more than just a way of recognizing ourselves. We live in the era of the computer, which sees far more than we do and learns far faster than we do. Most importantly, it sees some really personal secrets. For a second my mind froze with happiness, as I can finally imagine what the Mona Lisa’s real emotion was 😀. RIP, da Vinci.

Yeah… she looks neutral!

In a fascinating experiment at Stanford University, two researchers had a computer study more than 35,000 photos of self-identified gay and heterosexual people from public dating sites. After training, the algorithm was able to correctly distinguish between gay and heterosexual men with an accuracy reaching 81%, and 71% for women. Just from the face, the artificial intelligence became able to differentiate, identify and predict the subjects’ sexual orientation. At the end of the research, the researchers wrote: “our findings expose a threat to the privacy of men and women”. Moreover, if you show the system 5 pictures of the same person, its recognition accuracy jumps up to 91%, and this just from the face. A surprising percentage, as the human mind only reaches about 60% with the same number of photos. From this study we can see that there is a direct positive relationship between increasing the sample size and the level of recognition accuracy.

So, from what has been stated above, we can conclude that increasing the sample size increases the accuracy level. We can see this conclusion supported by results reported from China, where the technology is now able to recognize citizens of a heavily populated country in less than 3 seconds. Out of 1,400,000,000 citizens (the sample size), it can identify any specified citizen within that time. Moreover, the technology is applied in different fields; many criminals have already been caught using it at the same speed. And this is happening today. You can imagine that as China develops this technology further, which is currently being done, the duration will shrink and the accuracy level will rise, in line with the conclusion above.

Not quite there yet in the real world, but Hollywood already is 😉. ZOOM! ENHANCE!
Selfie time: new AR technology in KFC Original+ Beijing will know what you want to eat 🍔🍟

Last year, Craig Venter, one of the greatest scientists in the field of biology, specifically in DNA sequencing, published research in the journal PNAS illustrating that, using DNA alone, a machine can recognize 80% of the corresponding faces. If you give it 10 DNA samples, for example, it can predict what 8 of those people’s faces will look like; with a sample of 1,000 it reached an accuracy of 80%. So, if this were applied to, e.g., China, you can imagine how high the accuracy could become in the future. Again, our face reflects our genes. If you see this as no big deal, I won’t agree with you: in the 1950s we exerted every effort just to determine the structure of DNA. So, in a nutshell, whatever is hidden in our genes, we can see it on our faces.

Official Apple Touch ID icon. It seems to be happy…

Some readers may say it is just an identification technology similar to Touch ID and it is not going to save the Titanic. I will tell them: you are right, it is not going to save the Titanic, but it is going to save many health insurance companies. This technology will have a great effect on them, as 30-40% of genetic diseases manifest as craniofacial abnormalities, i.e. they affect the face and skull, as Hart and Hart stated in 2009. Health insurance companies can learn a lot about your health without any blood test, just from an analysis of your face, something you have no say in. This will help insurance underwriters to minimize risk as much as possible.

To sum up, just by using facial recognition technology, devices can reveal our characteristics, orientations, and behaviors. So, are you still interested in the iPhone, or will you turn to Android?
