This podcast is an initiative of the Development Community of the DOAG e.V.

[English] Heli Helskyaho @ OUGN 2023 in Oslo - Machine Learning and Family Business

Shownotes

In this episode we talked about the evolution of machine learning and how it is used in the Oracle environment. There are a few examples of how to picture ML in our daily work life.

We also talked about Heli's family and what it is like to have two family members as her own employees.

Have fun and let us know what you think!

Heli on Twitter: @HeliFromFinland
Devs On Tape on Twitter: @devsontape
Kai Donato - kai.donato@mt-ag.com - Twitter: @_KaiDonato
Carolin Krützmann - carolin.hagemann@doag.org - Twitter: @CaroHagi

This podcast enjoys the kind support of the German Oracle User Group (DOAG e.V. - https://doag.org)

Transcript

00:00:00: [MUSIC PLAYING]

00:00:03: Hello, and welcome to another episode of Devs on Tape,

00:00:15: this time from Norway, from the OUGN conference in Oslo,

00:00:21: ladies and gentlemen.

00:00:23: On today's podcast, we are thrilled to welcome

00:00:25: a true pioneer in the world of technology.

00:00:27: She's already laughing.

00:00:29: She is an accomplished entrepreneur and international speaker

00:00:32: and a recognized expert in the field of database management.

00:00:35: With over 25 years of experience in the industry,

00:00:39: she has established herself as one of the most respected voices

00:00:42: in the world of technology.

00:00:44: As the CEO of Miracle Finland Oy,

00:00:47: a company that specializes in database management

00:00:49: and data integration solutions,

00:00:51: she has been at the forefront of the digital transformation

00:00:54: that is sweeping across the world.

00:00:56: She has helped countless organizations

00:00:59: leverage the power of data to drive innovation and growth.

00:01:02: And her insights have been sought after

00:01:05: by businesses of all sizes.

00:01:08: She is also an active member of the global technology

00:01:11: community, serving as the president of the Finnish Oracle

00:01:13: User Group and a board member of the International Oracle User

00:01:16: Group.

00:01:17: She has been recognized with numerous awards

00:01:19: for her contributions to the industry,

00:01:21: including the prestigious Oracle ACE Director award.

00:01:25: Today, we are honored to have her join us on the podcast

00:01:29: to share her knowledge and insights

00:01:31: on the latest trends in technology,

00:01:33: the challenges facing the industry,

00:01:36: and the opportunities that lie ahead.

00:01:38: So without further ado, let's dive in and welcome Heli,

00:01:42: oh, that's hard, Hels Ki-Yaho.

00:01:45: - Almost. (laughs)

00:01:46: - To the show.

00:01:47: Hello, Heli. - Hi.

00:01:49: - We stick to Heli, right?

00:01:50: - Yes, Heli from Finland.

00:01:51: - Heli from Finland, that's the way we can work with that.

00:01:55: All right, yeah, I'm very thrilled to talk to you today in the podcast episode.

00:01:59: And yes, unfortunately, Caro didn't make it to Oslo, so we are just the two of us.

00:02:05: And we have plenty of things to talk about.

00:02:08: So maybe you can just introduce yourself besides my introduction to you and tell us

00:02:13: everything, where it starts, where did you start going into the technology branch and

00:02:18: everything about maybe your company, Miracle Finland, or why?

00:02:23: Well, first of all, I think it's a surprise to everybody that I am in IT because that

00:02:28: was not the plan.

00:02:30: I never thought I would be in IT.

00:02:32: It was not something that I even considered.

00:02:35: So I was actually studying mathematics.

00:02:36: I was thinking I will study mathematics or economics.

00:02:39: So I'm still actually studying economics.

00:02:40: I haven't graduated from there yet, but I was studying mathematics and then I took some

00:02:44: minor courses on computer science and they made no sense to me.

00:02:47: I was like, come on, this can't be true because it felt so hard.

00:02:51: Studying has always been easy to me.

00:02:53: I always loved studying.

00:02:55: I was like, this is so hard.

00:02:56: How can it be?

00:02:57: And I took another course and another course

00:02:59: and another course until I finished all the minor courses.

00:03:02: And then I was thinking, okay,

00:03:03: I still don't understand this computer science.

00:03:05: So I changed my major to computer science.

00:03:08: Well, it was a funny conversation with a professor

00:03:10: from mathematics because he was like,

00:03:12: "Heli, are you sure?

00:03:13: Because you are good in mathematics,

00:03:15: but you are no good in computer science."

00:03:17: - Yeah, just in the courses, yeah.

00:03:19: you still want to change to computer science? I said, Yes, I do. Because I want to understand

00:03:24: there must be something in computer science. So I can't buy this. It can't be this stupid.

00:03:29: It has to be something that I don't understand yet. Yeah. And yeah, it was because databases

00:03:34: was my thing. Until then it was like operating systems, networks. Oh my god, I still don't

00:03:38: like them. Yeah, sure. The basic stuff, right? Yeah. But databases and data, that was something

00:03:44: that I was like, Oh, now I got it. I got it. This is this is what I want to do. So I graduated

00:03:49: with a master's degree from computer science and that's it.

00:03:53: And then you just opened a new company, right?

00:03:56: Yeah.

00:03:57: Directly or was there some time between?

00:04:00: I was working for Oracle for a while, by the way, before this.

00:04:04: Oh, yeah.

00:04:05: I didn't get this, no.

00:04:06: Yeah.

00:04:07: So from Oracle, I started my own company in 2000 and that was called Kandamestarit.

00:04:12: It means like base masters or database masters, something like that.

00:04:16: We started that in 2000 and I worked for that for quite many years.

00:04:19: I actually still have that company, but then I joined Miracle in 2010.

00:04:23: Okay.

00:04:24: So, and then the time starts when you were presenting at conferences, right?

00:04:29: So, I mean, almost every schedule of a conference I see "Heli from Finland".

00:04:33: Yeah.

00:04:33: Just avoiding saying it again.

00:04:35: That's what everybody says.

00:04:37: Yeah.

00:04:38: Actually, I started in 2007.

00:04:39: So, when I started to be active in EOUC, the EMEA user community,

00:04:44: So then I started speaking.

00:04:46: I went, my first presentation was in Croatia,

00:04:48: the Croatian user group.

00:04:50: And that's how it all started.

00:04:51: I realized I really liked this.

00:04:52: So I have been teaching a lot.

00:04:54: So throughout my career, I have always been teaching.

00:04:56: So I kind of like teaching,

00:04:58: but I didn't really present in conferences until that.

00:05:02: So 2007, and that's how it all started.

00:05:05: Suddenly I was everywhere presenting

00:05:07: because it was so much fun.

00:05:09: - I was just thinking about what I did in 2007.

00:05:12: I don't think it was anything near IT stuff, right?

00:05:15: (both laughing)

00:05:17: And now you ended up writing numerous books, right?

00:05:19: So my opinion on that is a little bit different,

00:05:24: I think, because writing books in the IT

00:05:26: is like writing text, getting everything in place,

00:05:30: publish it, like print it, publish it,

00:05:32: and then sell it and everyone reads.

00:05:34: And by the time this is ready

00:05:37: and lying on my table, it's already outdated, right?

00:05:39: What are your opinions about that?

00:05:40: So you're grinning, right? - Yeah, that's true.

00:05:42: That's true.

00:05:43: But my first topic was about database designing.

00:05:45: That's never old.

00:05:46: So it's always the same.

00:05:47: Okay.

00:05:48: It also talks about Data Modeler, which has changed quite a lot since the book came

00:05:52: out in 2015.

00:05:55: But it hasn't changed that much because the designing work is still the same.

00:06:01: So I think that's still valid.

00:06:03: Still valid.

00:06:04: And the second one is about SQL and PL/SQL.

00:06:06: Where would that go?

00:06:07: There's a lot of new features coming, but the old features are still there.

00:06:11: So it's still okay.

00:06:13: Then the two other books are about machine learning.

00:06:16: The first one is about machine learning for Oracle professionals.

00:06:19: So that introduces how Oracle is handling machine learning.

00:06:22: So what kind of tools you have and how to use them and so on.

00:06:26: Well, that's still valid, but there are many more tools available now than there were at

00:06:29: the time.

00:06:30: Yeah, sure.

00:06:31: And then the latest one, with Adrian Png, is about APEX.

00:06:35: Oh yeah, APEX never gets old either, right?

00:06:37: It's never old.

00:06:38: Apex is young forever.

00:06:40: Now, seriously, that's interesting because we are talking about APEX, OCI, machine learning,

00:06:46: all that kind of things.

00:06:47: So, as an APEX developer, what should you know about OCI?

00:06:50: What should you know about machine learning and so on?

00:06:52: So, I don't think it will be old very soon.

00:06:54: Yeah.

00:06:55: I mean, I have my own history with that because there was Jürgen Ziem writing a book about

00:07:00: APEX, like the big compendium of everything with APEX.

00:07:04: And I was asked to write the chapter about mobile.

00:07:08: And it was right before the Universal Theme was released and there were rumors and saying,

00:07:12: "Okay, you have one theme for everything."

00:07:14: It was responsive.

00:07:15: And I just wrote like pages about the mobile theme, jQuery mobile stuff back then.

00:07:21: And at the time when the book was released, the Universal Theme with Apex 5 was there

00:07:25: and it was outdated.

00:07:27: Just the passage where I was saying, "I guess there might be a theme which is completely

00:07:33: fully responsive to smaller screens and there might be something cooler than

00:07:38: what you're reading right now," and it happened, right? So this is my opinion,

00:07:42: that's the reason why I'm never... I have similar things there as well that are

00:07:46: already outdated but... I mean you can update that, you can have a sequel of the

00:07:49: book and you can talk about the newer stuff and if you switch over to

00:07:53: universal theme for example you might have luck for the next couple of

00:07:57: years that your book is still up to date, right? But are you just writing books or

00:08:03: Or do you write blog posts or like online media?

00:08:08: I used to.

00:08:09: I have a blog, but I haven't had any time to write lately.

00:08:11: So I'm always planning I will start writing again, but then I'm too busy with everything

00:08:15: else.

00:08:16: Yeah, you have to be continuously working on that blog, right?

00:08:18: So that's the difference.

00:08:19: Exactly.

00:08:20: So if you're writing a book, you're just having your time whenever you have it, right?

00:08:24: And then write your texts and chapters and every...

00:08:28: At some time the book is complete and you can publish that.

00:08:31: If you open a new blog and you're just writing a new article and then a few months nothing,

00:08:36: then it's not a good blog, right?

00:08:37: So we are on the same page with that.

00:08:40: So you already said machine learning.

00:08:43: I will not out myself as not into that topic.

00:08:47: Maybe you can use easy words for me and for our listeners, maybe for people who are not

00:08:52: familiar with machine learning.

00:08:54: What's the essence of machine learning?

00:08:56: How far are we already?

00:08:58: And how should I start learning about machine learning if I want to?

00:09:02: Yeah, the question of how far is a difficult question, because it depends on how you measure

00:09:07: it.

00:09:08: So there's a lot of cool things happening at the moment in this area.

00:09:12: So you know, ChatGPT and so on, all these kinds of things.

00:09:15: So they are cool, but they are a little bit, I worry a little bit because people don't

00:09:20: realize that they are actually giving their data away when they use this kind of tools.

00:09:25: So they don't realize that actually the code that they asked the tool to build for them

00:09:29: includes some actually highly confidential information about their database or their

00:09:34: data or something like that.

00:09:35: So that kind of worries me a bit.

00:09:37: And secondly, it's not a genius.

00:09:40: It's just a tool that produces words, a text.

00:09:46: So it can just be giving you references to documentation or packages or whatever that

00:09:52: don't even exist.

00:09:53: Yeah, I heard about that too.

00:09:56: Yeah, so it happens very easily.

00:09:58: So it gets confused quite easily because it's not really that intelligent.

00:10:02: But we are going to that way all the time.

00:10:04: So now we have GPT-4, we have other things that are already available,

00:10:10: not just the three that I was mentioning earlier.

00:10:13: But I think we are very far with it, but not as far as people seem to think.

00:10:21: So it's not as intelligent as some people think it is.

00:10:24: But yes, I think machine learning is something that everybody should know about.

00:10:28: So if you don't know about it, you are losing a lot.

00:10:32: So yeah, we are talking about ML and nowadays, when we talk about ChatGPT and so on, everyone

00:10:39: is talking about AI.

00:10:41: And I think everyone should know that one is not the same as the other, I think, but

00:10:47: the groundwork beyond that might be the same, right?

00:10:50: So we had a couple of episodes about the ChatGPT stuff

00:10:56: and the GitHub Copilot and everything

00:10:59: that is possible today for our vision, right?

00:11:02: So you're typing some text and you've got answers

00:11:04: and how impressive is this?

00:11:05: And this is the way why people are getting their hands

00:11:09: on machine learning parts,

00:11:11: just a little bit, the tip of the iceberg.

00:11:14: But I'm interested how you can use machine learning

00:11:18: in a corporate environment, not just for the end user who's

00:11:20: writing, texting, get some documentation summarized

00:11:23: or something.

00:11:25: And I think you could give us some insight how

00:11:28: you can use machine learning to, yes,

00:11:30: bring your corporate business logic to better results.

00:11:35: Yeah, so probably your data is in Oracle database, I assume.

00:11:39: Sure.

00:11:40: So then you have plenty of options.

00:11:42: So we have machine learning in the database.

00:11:45: There, you're able to do it with SQL and PL/SQL.

00:11:49: You're able to do it with Python and you're able to do it with R. So you can create, you

00:11:54: can train, you can use any kind of models from the database, just like that.

00:12:00: But how you can do it, you can use PL/SQL packages, which is something that you probably

00:12:04: like because you are familiar with PL/SQL.
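
To make that PL/SQL route a bit more concrete, here is a minimal sketch (not from the episode; table and column names such as CUSTOMERS_TRAIN, CUST_ID and AFFINITY_CARD are hypothetical) of training an in-database classification model with the DBMS_DATA_MINING package:

```sql
-- Minimal sketch, assuming a training table CUSTOMERS_TRAIN with a CUST_ID key
-- and an AFFINITY_CARD target column (both hypothetical names).
BEGIN
  DBMS_DATA_MINING.CREATE_MODEL(
    model_name          => 'AFFINITY_CARD_MODEL',
    mining_function     => DBMS_DATA_MINING.CLASSIFICATION,
    data_table_name     => 'CUSTOMERS_TRAIN',
    case_id_column_name => 'CUST_ID',
    target_column_name  => 'AFFINITY_CARD');
END;
/
```

Without a settings table the package falls back to its default classification settings; a settings table can be supplied to pick a specific algorithm.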

00:12:07: But if you're not, there are other options as well.

00:12:10: So you can use the thing called AutoML, which is a quite simple tool in the Oracle Cloud

00:12:15: Infrastructure.

00:12:16: So you just select the dataset that you're going to use and you define what's the target

00:12:22: that you want to predict.

00:12:23: Let's say you want to predict affinity card, for example, one of the classical examples.

00:12:28: So will this customer take an affinity card or not?

00:12:31: So you select that as a target, then you select this classification problem.

00:12:37: Then you define the ID, the identifier of the customer, and that's about it.

00:12:43: You press start the process, and it will create you the models.

00:12:46: It does that for each algorithm it has in the database.

00:12:49: So for classification, it has plenty of algorithms.

00:12:52: For each algorithm and for your data set, it creates a model.

00:12:55: So you will have as many models as algorithms you chose.

00:12:59: And based on the metrics that you have chosen, so it gives you a list of possible metrics

00:13:03: that are for classification problem.

00:13:06: You choose one metric and based on this metric,

00:13:09: it will just put them in order.

00:13:10: So the first one is the best and so on and so on.

00:13:13: You can also add more metrics.

00:13:15: If this one doesn't tell you enough,

00:13:17: or maybe two is just the same,

00:13:19: you can add more metrics and compare those to each other

00:13:22: and decide which one is the one that you want to use.

00:13:24: And you just deploy it.

00:13:26: - Just click and just provide everything.

00:13:28: That sounds way too easy for me.

00:13:30: - It is, it is super easy.

00:13:32: And all these models that it creates

00:13:35: are stored in a database.

00:13:36: So from APEX perspective, it's very easy to call them.

00:13:39: You just use the apply procedure to call them.

00:13:42: - Okay.

00:13:43: - Very simple.
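
As a rough illustration of that call (the "apply procedure" presumably being DBMS_DATA_MINING.APPLY; the model, table and column names below are hypothetical, not from the episode), a batch-scoring sketch whose result table an APEX page can then query:

```sql
-- Minimal sketch with hypothetical names: batch-score new customers
-- with a stored model into a result table.
BEGIN
  DBMS_DATA_MINING.APPLY(
    model_name          => 'AFFINITY_CARD_MODEL',
    data_table_name     => 'CUSTOMERS_NEW',
    case_id_column_name => 'CUST_ID',
    result_table_name   => 'AFFINITY_CARD_SCORES');
END;
/
```

The result table then holds the predicted class and probability per case, readable from APEX like any other table.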

00:13:44: - So, and how do you train those models afterwards?

00:13:47: So you just say AutoML,

00:13:49: you told me the classification starts,

00:13:51: the models are being created,

00:13:54: and then it's working on the real data we have

00:13:56: and which will grow and grow and grow, right?

00:13:59: - Yes.

00:13:59: - And then this model will grow also.

00:14:01: - Yes, and Oracle also has new tools now

00:14:03: for actually following both the model and the data.

00:14:07: So if the model changes too much,

00:14:09: it will tell you and you need to train it more.

00:14:12: Or if the data changes so that the model

00:14:13: is not good for that data anymore,

00:14:15: it will also warn you about that.

00:14:17: So that's kind of a new thing.

00:14:18: Right now it's only available in a way that you have to call it yourself,

00:14:22: but I'm sure it's gonna be like a service

00:14:24: that you just go and click like AutoML.

00:14:26: That's probably coming soon.

00:14:27: So that is very useful.

00:14:30: - That sounds promising.

00:14:31: Maybe we have a deep dive in the 23c database,

00:14:36: which just came out.

00:14:37: And we had a talk with Gerald Venzl,

00:14:39: which was the last episode, and he said,

00:14:41: there are many nice new features.

00:14:43: And if we are expecting some more regarding ML,

00:14:46: that should be in there, I think.

00:14:47: So you talked about different tools we have to use that.

00:14:52: Now the question in the Oracle environment,

00:14:55: is this for free in your database license?

00:14:57: - Yeah, the database is free.

00:14:58: - Yes, the one that is in database is for free.

00:15:01: And also AutoML is for free.

00:15:04: But there's also services that are not for free,

00:15:06: like a data science service,

00:15:08: which is a Python environment in the OCI,

00:15:11: where you can just build your machine learning models

00:15:14: using your favorite Python libraries.

00:15:16: - Oh, great.

00:15:17: - So that's actually quite interesting.

00:15:19: And then you can publish those environments and your models

00:15:22: and whatever you want to publish from there,

00:15:24: and you can use them from other tools from Oracle.

00:15:27: Like if you publish conda environment,

00:15:30: you can import that into the Oracle database.

00:15:31: So you will have this conda environment in your database.

00:15:34: Or if you have a model that you will publish from there,

00:15:39: you can call it from other tools as well.

00:15:42: - So you can exchange the environments

00:15:44: with everything you achieved.

00:15:46: - Yes, yeah.

00:15:47: I think the coolest thing with Oracle environment

00:15:50: is that you can just integrate all your tools

00:15:52: easily to each other.

00:15:53: So you can use whatever is the best

00:15:55: for this particular use case, you can just use this

00:15:59: and it's not wasted.

00:16:00: You can use it from others as well.

00:16:03: And one of the paid tools is also Analytics Cloud.

00:16:06: Oracle Analytics Cloud, that can also do AutoML nowadays.

00:16:09: It can do all kind of machine learning things.

00:16:11: Actually, I will talk about that tomorrow.

00:16:14: So there's so many machine learning things

00:16:16: in Analytics Cloud, which is good

00:16:18: because that's the tool for the business people as well.

00:16:21: Kind of the people who understand the data,

00:16:23: not just technology.

00:16:24: Well, they are probably not too techie,

00:16:26: but they understand the data.

00:16:28: So this is a tool for them to be able

00:16:31: to use machine learning,

00:16:32: either the models that the data scientists

00:16:34: have created in the database,

00:16:35: so they can use those models with their data,

00:16:38: or they can use AutoML or some other functionalities

00:16:41: in OAC to create their own models.

00:16:44: - And just adjust what they need to do,

00:16:47: change it for their purposes.

00:16:49: Okay, so as always,

00:16:51: When I hear about new special technology,

00:16:54: I have the technology, I want to try it,

00:16:57: and then I'm searching for a use case, right?

00:17:00: So I think you need very big data sets

00:17:04: to use ML properly or--

00:17:06: - Not really, not really.

00:17:08: Actually, if you think about it,

00:17:09: more important is that your data set has good quality data

00:17:12: than that you have billions of rows of data.

00:17:15: So if you have a lot of data that is not good quality,

00:17:18: it's actually bad for machine learning

00:17:19: So it's learning different wrong things.

00:17:22: So how do I get information about the quality of my data?

00:17:26: So I know it from a different perspective.

00:17:28: If I have good data, you have this third normal form.

00:17:34: I don't know the exact translation of that.

00:17:36: But you have everything in place, a good data model for proper use in APEX application

00:17:40: or something.

00:17:41: Is it also good for machine learning automatically?

00:17:44: Or do you have a view on data modeling that you might have some ML in the future so you

00:17:51: just get some best practices in there?

00:17:53: Yeah.

00:17:54: One of the biggest things is nulls.

00:17:56: So if your application allows nulls, you might end up having a lot of nulls in your database.

00:18:01: So what's the point?

00:18:02: Because it doesn't tell anything.

00:18:04: So that's probably the biggest problem with this kind of things.

00:18:09: Another problem I often see is that you have a column that you have used for one purpose,

00:18:13: and one day you are just too lazy to add a new column.

00:18:16: So you start using the same column for another purpose.

00:18:20: So the data is not consistent over time.

00:18:26: So that kind of things can really affect badly.
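
One simple way to spot the null problem before training is to profile how many values a candidate feature actually has; a minimal sketch, assuming a hypothetical CUSTOMERS table with an INCOME column:

```sql
-- Minimal sketch with hypothetical names: COUNT(col) ignores NULLs,
-- so the difference to COUNT(*) is the number of missing values.
SELECT COUNT(*)                                              AS total_rows,
       COUNT(income)                                         AS income_present,
       COUNT(*) - COUNT(income)                              AS income_missing,
       ROUND(100 * (COUNT(*) - COUNT(income)) / COUNT(*), 1) AS income_missing_pct
FROM   customers;
```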

00:18:29: Do we have another example for us where you personally used ML to reach a goal in a project?

00:18:39: Well actually there's several use cases where machine learning has been used.

00:18:43: It depends on your business where it's most useful.

00:18:46: But if you want to start with auto ML,

00:18:47: which is the easy part, you have two things to do.

00:18:50: You can do classification problems or regression.

00:18:53: Classification is that you put things into classes.

00:18:56: Like for example, if you're sending offers

00:18:58: to your customers, you can predict which of these

00:19:01: will end up with a deal and which will not end up

00:19:04: with a deal.

00:19:05: So you can put more effort to those that you know

00:19:08: that most likely will end up well and not badly.
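
A minimal sketch of what that could look like in SQL (assuming a hypothetical OFFER_MODEL classification model, an OPEN_OFFERS table and a 'WON' class label): rank open offers by the predicted probability of closing, so the effort goes where it is most likely to pay off.

```sql
-- Minimal sketch with hypothetical names: rank open offers by the model's
-- estimated probability that they end in a deal.
SELECT offer_id,
       customer_id,
       PREDICTION_PROBABILITY(OFFER_MODEL, 'WON' USING *) AS p_won
FROM   open_offers
ORDER  BY p_won DESC;
```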

00:19:12: That sounds so surreal to me that a machine

00:19:14: will give me information about how a customer reacts.

00:19:18: I think that, with my customers, I can have a very good view

00:19:22: on how they will react on what I'm sending them.

00:19:24: How does a machine do that?

00:19:27: - Yes.

00:19:28: - And that's a magical moment.

00:19:29: I do not know if I can trust something like that

00:19:33: because I don't know how this works,

00:19:36: how this machine is predicting something

00:19:39: based on what I'm giving it.

00:19:42: So maybe you can help me with that.

00:19:44: - Yes, so that's the thing: people want to see

00:19:47: why this customer was chosen as not potential.

00:19:51: - Okay.

00:19:52: - And you can see it from the data.

00:19:53: So why did the model make this decision?

00:19:57: So that's something the model tells you if you so want,

00:20:00: or why was this customer chosen as a potential customer?

00:20:04: But the truth is that model is never 100% sure.

00:20:08: And if you end up with a model that claims to be 100% sure,

00:20:12: you can be sure that it's not true.

00:20:13: There's something very fishy there.
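
To get at that "why", Oracle's SQL scoring functions can also report which attributes drove an individual prediction; a minimal sketch, reusing the hypothetical OFFER_MODEL and OPEN_OFFERS names from above:

```sql
-- Minimal sketch with hypothetical names: PREDICTION_DETAILS returns an XML
-- fragment listing the attributes that influenced this particular prediction.
SELECT offer_id,
       PREDICTION(OFFER_MODEL USING *)         AS predicted_class,
       PREDICTION_DETAILS(OFFER_MODEL USING *) AS why
FROM   open_offers;
```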

00:20:15: - So what you're doing is that you're feeding information

00:20:19: into the system, which is telling,

00:20:21: I was sending this offer to the customer.

00:20:23: Those are the metrics of the customer, for example.

00:20:26: And if something, or if this offer

00:20:29: is not being successful to the customer,

00:20:32: I'm giving the information to the system

00:20:35: and the system itself decides for the next customer

00:20:38: if it's the same condition or how can I just imagine?

00:20:43: - The model is built using the features

00:20:45: like in a database, the columns.

00:20:46: So what do you know about a customer?

00:20:48: So based on this customer information,

00:20:50: the model creates ideas about what features are the ones

00:20:55: that are important for this decision.

00:20:58: So for example, if the customer is already our customer,

00:21:02: it's probably a feature that will have high impact

00:21:06: on the decision and so on.

00:21:07: So based on these features, it decides if this customer is potential or not.

00:21:12: So it's not this particular, it's not the name of the customer or anything that defines

00:21:17: if the deal will be closed or not.

00:21:18: It's the features.

00:21:20: So customers who are similar to this have been signing the contract and customers who

00:21:26: are very much similar to the other side are those who will not sign the contract.

00:21:31: So your job is to find these features that will tell enough about the customer for the

00:21:36: model to make the prediction. Most likely the first time you try it will not give you

00:21:41: good predictions because you don't have enough features about the customer. You have to add

00:21:45: more data. What kind of customers these were. So in your data set you have customers who

00:21:50: ended up doing the deal and customers who did not. And based on this it builds the model.

00:21:57: And if this data that you use for building the model is not good enough or it doesn't

00:22:02: have enough information, the model is not working.

00:22:05: Okay. So we have things like the size of a company,

00:22:10: the geolocation of the company, maybe.

00:22:13: Yeah, could be.

00:22:14: And the more columns I have,

00:22:17: the better the model will be at the predictions afterwards.

00:22:21: And also the columns that you have, you might need to transform them somehow.

00:22:25: So maybe a classical example is about predicting the house price,

00:22:30: which is, by the way, regression.

00:22:32: So we were talking about classification.

00:22:33: Another very typical use cases is regression.

00:22:36: So for regression, we are trying to predict numbers, like for example, the house price.

00:22:41: So how much would I get from my house if I sell it?

00:22:44: So one example of a transformation is like, if I know the GPS coordinates for the house,

00:22:50: so this is where it is located, it means nothing.

00:22:52: Okay.

00:22:53: What do you do with the coordinates?

00:22:54: You can't compare like your coordinates are higher than mine.

00:22:57: So what?

00:22:57: Sure.

00:22:58: It doesn't affect the price, but you can, you can transform the coordinates

00:23:02: to something more useful like location, like it's in the city center, in a very

00:23:06: popular area, or it's by the beach or by the lake or in the middle of the forest,

00:23:12: in the middle of nowhere, no services close by, you can transform it something

00:23:16: like that, and then it's useful for the model.

00:23:18: So the model can say that this is in a high end area, good area, price is higher.

00:23:25: This is in the middle of nowhere, the price is lower.
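
A sketch of such a transformation in SQL (all names hypothetical, and assuming distances have already been derived from the coordinates): the raw numbers become a categorical location feature the model can actually learn from.

```sql
-- Minimal sketch with hypothetical names: turn precomputed distances into a
-- categorical location feature for the house-price model.
SELECT house_id,
       price,
       CASE
         WHEN distance_to_center_km < 2  THEN 'CITY_CENTER'
         WHEN distance_to_water_km  < 1  THEN 'WATERFRONT'
         WHEN distance_to_center_km < 15 THEN 'SUBURB'
         ELSE 'REMOTE'
       END AS location_class
FROM   houses;
```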

00:23:27: So that's the way our future looks like, right?

00:23:30: So if you have anything like ImmoScout, a company who is just rating the different

00:23:36: prices for locations you can move to.

00:23:41: And then if every single customer or every single company is using ML

00:23:46: in future to predict how good this location or how good this house will be

00:23:51: on the market and how well it will sell, that would be way easier than one guy

00:23:56: who's just estimating that, right?

00:23:57: Yes.

00:23:58: Yes.

00:23:58: I'm seeing that.

00:23:59: And also if you think about from your business side,

00:24:02: if all your competitors are doing this and you are not,

00:24:07: who's gonna win the competition?

00:24:08: - Of course, yeah.

00:24:09: - Yeah, because you still have the people

00:24:11: with a pen and paper and they are trying to do things.

00:24:13: And those are just pressing a button

00:24:15: and the machine is doing so many things 24/7.

00:24:18: - Yeah, I mean, but there have to be people

00:24:20: to understand that, right?

00:24:22: - Absolutely.

00:24:22: - You cannot let the machine do it and trust on it.

00:24:25: - The worst is if you let the machine do it and trust it,

00:24:27: Because then probably the data is completely wrong.

00:24:31: It's not transformed correctly for the model, for the transformation from data

00:24:38: algorithm to the model.

00:24:39: So it's transformed wrongly to the model.

00:24:42: The model is doing something completely different.

00:24:44: Classical example.

00:24:45: So there was a model that was predicting if the picture is of a dog or a wolf.

00:24:51: And it was really accurate.

00:24:53: People were like amazed.

00:24:54: How can it be so accurate?

00:24:56: But instead of really knowing if it's a dog or a wolf,

00:24:59: it actually knew whether it's grass or snow.

00:25:02: So wolves were on the snow and dogs were on the grass.

00:25:06: So when there was a dog on the snow,

00:25:07: it was saying it's a wolf

00:25:08: and it didn't even look like a wolf.

00:25:11: So like, what are you doing?

00:25:13: So the machine learned wrong things.

00:25:15: You were thinking it learned right things

00:25:17: 'cause it was always correct until that day.

00:25:20: So you can never be sure what it actually learns

00:25:24: unless you check what it learns.

00:25:26: So why are you making your decisions?

00:25:29: - So you would go so far to say

00:25:31: these machine learning models

00:25:36: and all the technology around that

00:25:38: should just assist human people,

00:25:41: human beings to reach their goals.

00:25:45: It should just assist and not be the core of your company

00:25:49: or your goal, right?

00:25:50: - Yes, it should be assisting you

00:25:51: and you make the decisions based on what you get.

00:25:54: - That's great.

00:25:55: - If the decisions are crazy,

00:25:56: like one of my favorite examples is that

00:26:00: when I went to Los Angeles, it started raining.

00:26:02: I have been there once and it rained.

00:26:05: So the computer would say 100% sure

00:26:07: that when Heli lands in Los Angeles, it starts raining.

00:26:10: 'Cause it's not very common that it rains in Los Angeles.

00:26:12: They actually asked me to come back

00:26:14: because of my super skills.

00:26:16: So it started raining.

00:26:17: So the machine would say that if you want Los Angeles

00:26:20: to have rain, let Heli come here and it will rain.

00:26:23: So this kind of decisions would be nonsense

00:26:26: because the computer doesn't have common sense.

00:26:28: It only has the data and the algorithm

00:26:31: and it comes to a conclusion.

00:26:32: But if the data is,

00:26:34: there's only one row saying, "Heli, Los Angeles, rain."

00:26:37: It's a fact.

00:26:38: - Yeah, right.

00:26:39: - But if you see outside, it's raining.

00:26:41: So I might have skills.

00:26:42: - Yeah, I mean, we are in Oslo right now,

00:26:45: and end of April and it's now snowing outside, right?

00:26:48: So maybe there's some conclusion on that too,

00:26:50: that Heli is moving.

00:26:51: - I have super skills.

00:26:53: traveling to San Francisco, it rains and you're traveling to Oslo and it snows.

00:26:58: Yeah, could be. Who knows? Maybe the computer knows better than we.

00:27:02: We will see. We'll see what the future brings us and how this evolves.

00:27:06: So another topic I want to talk about is one thing that is mentioned as a special

00:27:13: project for us in our preparation. It's about the tracking of COVID-19 exposures

00:27:18: in Finland. This sounds so amazing for me, what I read about that. Maybe you can

00:27:22: just talk about that and tell us what did you do for this COVID tracking?

00:27:27: Yeah, that was a fun project.

00:27:29: That really was.

00:27:31: So I get a message in Twitter.

00:27:34: So somebody I used to work with like 20 years ago,

00:27:37: he sent me a message that we are trying to track COVID cases in Finland

00:27:41: and we are using a blog and an Excel sheet,

00:27:44: but it doesn't seem to work anymore because COVID is growing.

00:27:47: And this is too many cases for our technology.

00:27:51: So do you know anybody who could help us?

00:27:53: And we have no, we have zero budget.

00:27:55: I was like, Hmm, okay.

00:27:58: That was 11 in the evening or something like that.

00:27:59: I said, okay, let me sleep now.

00:28:01: And I will get back to you in the morning.

00:28:02: So in the morning I sent him a message.

00:28:05: I said, yes, we'll do it for you.

00:28:06: We will use the Oracle cloud, which is free.

00:28:09: And, uh, I will ask people who would volunteer to help with this.

00:28:12: So we will, we'll use Apex and we will build you a system.

00:28:15: And that's what we did.

00:28:16: So 10 days later we went to production.

00:28:19: So 10 days.

00:28:21: Yeah, that's a typical success story of Apex, right?

00:28:24: So we have two applications, actually.

00:28:26: We have the one for those volunteers who enter their data.

00:28:29: So they have their own credentials.

00:28:32: So everybody's using their own credentials when they log in and they enter their data.

00:28:36: And then we have another application that is open for everybody.

00:28:39: So anybody can go and see the COVID cases.

00:28:43: So where they are and so on and so on.

00:28:45: And for me, the best thing was, well, okay, after the 10 days, we've been improving it.

00:28:49: So it was not the, it was just the first release.

00:28:51: My part was the spatial thing because I love the multi-model database.

00:28:57: I love having graph and spatial and all these kinds of things in one single database.

00:29:01: So I wanted to do the spatial.

00:29:03: And so I did all kinds of maps for us to be visualizing where are the hotspots in Finland for

00:29:11: COVID.

00:29:12: Now we don't track in Finland anymore because it's just like that.

00:29:16: But if you want to go to the website, you can go and you just select dates that are like a year ago

00:29:21: And you can see how it works, but it was a very fun project really

00:29:25: It's a second project I heard around this pandemic

00:29:29: topic where APEX just did a really good job. It did, and OCI free tier as well. And actually

00:29:37: the local news

00:29:39: announced this web page one evening, and

00:29:43: so, the news was at 10 o'clock, and in the two hours

00:29:47: from 10 to midnight, we had 300,000 users on the web page.

00:29:51: Free tier web page.

00:29:53: Oh my God, I was sweating.

00:29:55: So me and one of my people who was also working on that,

00:29:58: we were just messaging with WhatsApp to each other.

00:30:00: I'm dying, I'm dying.

00:30:01: What's going on now?

00:30:02: What's going on?

00:30:03: It worked.

00:30:04: And it scales automatically even on a free tier, right?

00:30:06: Yeah, it works very well.

00:30:08: So we were like, "Whew."

00:30:09: Because you never know what happens.

00:30:11: because 300,000 is a lot in two hours.

00:30:14: - Yeah, sure.

00:30:15: - So that was a sweaty evening.

00:30:17: - Yeah, sure, of course.

00:30:19: But I mean, if you do something just out of nowhere,

00:30:23: like, yes, I can do so,

00:30:24: and you just use 10 days to build an application

00:30:28: with 300,000 users in the first two hours,

00:30:31: I can just imagine how proud you should be about that.

00:30:34: - Yeah, that was great.

00:30:36: And we learned so much also about APEX

00:30:38: and about many other things.

00:30:39: So it was a very educational project.

00:30:43: - And you were just mentioned

00:30:45: in the blogs of Oracle, right?

00:30:46: They use that for promotion, of course.

00:30:49: But yeah, it's sort of like tracking COVID-19 exposures

00:30:53: in Finland using Oracle Cloud

00:30:54: to develop an exposure tracking app in 10 days.

00:30:58: So this was like the best promotion for the cloud, right?

00:31:02: - Yes, yes, yes.

00:31:04: And I was so proud about my people.

00:31:06: So the volunteers who worked on this were from my company

00:31:09: and they did it on their own time.

00:31:11: - Oh yeah.

00:31:12: That's the motivation behind the Apex community, right?

00:31:16: How many people worked on that application then?

00:31:18: - We were actually, I think the maximum was four,

00:31:22: but mostly it was two or three, most of the time.

00:31:25: - And there wasn't time for a big setup, right?

00:31:27: And get everything in place properly,

00:31:29: like a good project is just, okay, we need to do it.

00:31:32: We need to do it now.

00:31:33: And Apex is a perfect platform to get even good results.

00:31:36: - Luckily we knew Apex beforehand,

00:31:38: So we didn't have to start from scratch.

00:31:40: - Yeah, sure.

00:31:41: But the good data model should be in place, right?

00:31:43: - Yes, it should, it should, it should.

00:31:45: - Exactly.

00:31:46: Yes, so let's talk about this conference right here.

00:31:51: Maybe you can tell us,

00:31:52: what are you talking about at the conference here?

00:31:55: - Yes, so first I have three presentations here.

00:31:58: The first one was about AutoML and AI services.

00:32:02: So AI Services is also a paid thing,

00:32:05: but it doesn't really cost much.

00:32:07: It's really cheap to use.

00:32:09: There's different kinds of services for different use cases like document recognition or image

00:32:14: recognition or object detection, that kind of things.

00:32:18: So you can use the pre-trained models, just put your own picture there and it will do

00:32:22: the trick for you.

00:32:23: And if it's not good enough, you can always use the pre-trained model and train it further

00:32:28: with your own data.

00:32:30: And these AI services you can call from Apex, from data science service, for analytics cloud,

00:32:35: of the other tools you can just call these services, either the pre-trained or the ones

00:32:39: that you have further trained. I don't know what's the right word, so I use further trained

00:32:44: now.

00:32:45: Yeah, that's all right.

00:32:46: That describes what it is. So first I was talking about those and I had my younger son

00:32:52: with me here, so he was co-presenting with me. He's also working for me now. He's been

00:32:56: working for a year now. So he's been doing Apex, he's been doing OCI, all kinds of machine

00:33:02: learning things and especially a kind of Raspberry Pi, Coral, that kind of thing.

00:33:07: So I have a few questions about that, but maybe, maybe go ahead and I will ask the questions

00:33:12: how it should be, how it is to work for his own mother.

00:33:17: And then the other presentation we just finished with my son was about AutoML.

00:33:21: So AutoML as it is in OCI, but also as it is in Analytics Cloud.

00:33:28: And later today we talk about Analytics Cloud and what kind of machine learning capabilities

00:33:32: it actually has. So three presentations.

00:33:34: Three presentations, two of them with your son or three?

00:33:37: I think he will be on also on the last one.

00:33:39: Yeah, great. So there are so many questions I have. So how is it to be the boss of your

00:33:45: own son in the company?

00:33:48: The first question is how is to be boss for your own husband? Because my husband also

00:33:52: works for the company.

00:33:53: Oh, that sounds like a great combination in family life, right?

00:33:58: Yeah.

00:33:59: Is it hard for you to separate the job and the private time?

00:34:05: Sometimes it might be.

00:34:07: Because my husband is also a share owner of the company.

00:34:10: That's easier because we have common interests on the company

00:34:13: and all that kind of thing. So that helps a lot.

00:34:15: And besides, when we met we were working also together.

00:34:18: So we have always been working together.

00:34:20: So I think that's not an issue.

00:34:22: But with the son it's a little bit more difficult

00:34:24: because then you have so many hats to wear.

00:34:27: Sure, yeah. So when mom is putting the hat on, she's my boss and she talks the other

00:34:34: way around and not the same as my mom.

00:34:36: And I warned him when he joined the company. I said, I'm going to be really tough because

00:34:42: I demand a lot from everybody, but I'm afraid I will demand more from you since you're my

00:34:46: son. And I just have to tell you now before you start, so you know what to expect.

00:34:52: But it's a great start in the working life, right?

00:34:55: So he doesn't have to find a good job or find the right place.

00:34:59: I mean, he might have been into your work already and knew what would come in this company,

00:35:07: what you would do.

00:35:08: But is this his free will to come to this conference and talk?

00:35:13: Or are you pushing him?

00:35:14: No, he wanted to come.

00:35:16: He likes presenting too.

00:35:18: That's great.

00:35:19: It was a question of mine because it's one thing that someone is presenting of his own liking,

00:35:28: like I want to present right now, I want to try it or if it's hey go present, it will

00:35:33: be your thing, you're a good speaker, you should do that and it's okay I'm going to

00:35:38: this conference and talk and I want to make sure that he's...

00:35:42: He likes presenting, my elder son doesn't like, I don't think he would like to go and

00:35:47: but the younger one does.

00:35:48: And he was actually presenting with me last summer already.

00:35:51: We went to the EMEA leaders meeting

00:35:53: and he was presenting there about this COVID application.

00:35:56: - Ah, yeah.

00:35:58: Yeah, great.

00:35:58: So, I mean, he's learning from the best, right?

00:36:00: So, he was practicing his part of the talk with you

00:36:04: and I'm sure that this goes out very well.

00:36:09: So, he's going to present in multiple conferences

00:36:13: in future alone, I think.

00:36:14: - Yeah, I think so, yeah.

00:36:15: So the path is already made and he will be in this galaxy with us in the future.

00:36:24: I hope so.

00:36:25: Yeah, we'll see.

00:36:26: I'm very much looking forward to seeing more of him in the future.

00:36:31: So I will go over to our categories.

00:36:37: Those are questions we very often ask our guests, to get some insight into you.

00:36:44: So let me start with the category hypothetically.

00:36:48: So if you could undo one technological trend in the past, what would it be?

00:36:56: Undo a trend.

00:36:57: It could be everything, right?

00:36:58: It could be software, it could be hardware, it could be like cars, electric cars or something

00:37:04: like that.

00:37:06: I don't think I would undo anything.

00:37:08: Nothing to undo.

00:37:09: No, I think everything is good because if it's a mistake, you just have to fix it.

00:37:13: You don't have to undo it, just fix it.

00:37:16: Yeah, great.

00:37:17: Everyone is thinking about, "Oh, what should I say about that?"

00:37:22: But yeah, just don't undo anything.

00:37:25: Yeah, just fix it.

00:37:26: Just fix it and make it better.

00:37:27: Yeah, because one of my super skills is making fast decisions,

00:37:30: and that's the kind of same thing.

00:37:31: So I make decisions very quickly,

00:37:33: and if it's a wrong decision, I change it.

00:37:35: So I kind of--

00:37:36: You fix it before someone else just made the decision, right?

00:37:39: Yeah.

00:37:40: Okay, so if you don't want to make something unhappen or undo something,

00:37:48: what would you like to invent or create in the technology sector, which was not invented before?

00:37:55: I think what is missing from this field is probably human skills.

00:38:04: So I think that's something that we could add more.

00:38:07: So people are good with their job, they know the technology super well,

00:38:11: but they are not very good with other people.

00:38:13: So I think kind of getting more respect and understanding of other opinions

00:38:19: and that kind of things would be a good thing.

00:38:21: That's one part of many conferences and I for myself can talk for Apex Connect,

00:38:27: for example, where I'm working in the organization team.

00:38:29: We are trying to get as many soft skill presentations in as possible

00:38:34: because it's not always the most important thing to just have this

00:38:37: technological part of the conference and you learn software stuff or

00:38:41: programming stuff, but you also learn how to deal with, like, hypersensitive

00:38:47: people, how to work with differently minded people maybe,

00:38:53: just to work well together and how to respect their opinions and so on.

00:38:57: That's a very great point. So additionally, do you have anything in

00:39:03: your mind for the future, what you want to deal with? Like, now you're in the ML

00:39:10: universe, right? And you have so many topics we are talking about, like data modeling back

00:39:14: in the days, and then now ML. Is there any untouched topic for you, you would like to

00:39:21: learn more in future?

00:39:22: Hmm. Well, anything to do with data, that's what I love to do. So probably what I would

00:39:29: like to work on next is analytics cloud. So I have been working with it and I have my

00:39:33: certificates and everything, but I would like to learn further with that. So kind of understanding

00:39:39: data better and being able to teach people to understand the data better. So I think

00:39:45: that would be the next thing.

00:39:47: And there will be data forever, right?

00:39:49: There will be data forever.

00:39:51: It's getting more and more and everything is relying on data today. Yeah, great. Wow.

00:39:57: So I mean, additionally to that, I have another question from that, it builds up perfectly.

00:40:02: So what would you estimate?

00:40:05: What will your daily work look like in 10 years?

00:40:08: It's just training models or just revisiting or reviewing results from machine learning

00:40:15: algorithms and models?

00:40:17: I love the question because I don't even know tomorrow.

00:40:19: I mean, this is pretty good.

00:40:21: Yeah, my work is always a surprise because I work in so many areas.

00:40:25: So like now I have several projects.

00:40:27: I have an Exadata migration.

00:40:29: I have a data model project.

00:40:31: I have machine learning projects, several machine learning projects.

00:40:36: I'm running the company.

00:40:38: I'm presenting.

00:40:39: I'm writing books.

00:40:42: I'm working on my PhD, so hopefully I will finish that one day.

00:40:46: So I have kind of many things going on, so what I'm doing most of the time is prioritizing

00:40:51: things.

00:40:52: Yeah.

00:40:53: manage the next day. I mean we can summarize on the point that your daily

00:40:58: work in 10 years would look the same as today, but with a PhD, right? Yes.

00:41:03: That's how it would be. Great. So the next category is in private. So are you

00:41:09: satisfied with your work-life balance? This might be something you already

00:41:13: told with your family structure that everyone is working in the same area

00:41:17: even in the same company right? Are you happy with the work-life balance? Can you

00:41:22: shut off, shut off from work. Like, now work is done for today. And I'm just

00:41:28: Yeah, I have a problem here because my work is my hobby. I kind of love this

00:41:34: work. You might guess the most heard answer. Yeah, because in this

00:41:39: category everyone is saying that. Yeah, I might refer to Mike.

00:41:43: Mike Becker, who was like, I think the episode before the last one, he was

00:41:50: saying I don't have a work-life balance, I'm having a work-life blending.

00:41:55: It's not a balance, it's just seeing what is currently more important to me

00:42:01: and I'm seeing how my life is basically in my work.

00:42:06: It's everything inside, it's everything blended, right?

00:42:10: So I was guessing that.

00:42:11: So if you are owning a successful company,

00:42:13: you don't have that much life between, so separately from work.

00:42:17: But if you see your work as part of your free time and life and hobby stuff,

00:42:22: that is, I think, the most sufficient thing.

00:42:26: So, another question from in private.

00:42:29: Would you show us your screen time on your iPhone without blushing?

00:42:33: So, don't say you don't have an iPhone because this answer was silly.

00:42:37: Okay. So, are you the type of person who is just constantly watching on the screen?

00:42:43: Well, I work a lot from my phone for some reason.

00:42:46: So a lot of things I do from my phone.

00:42:48: So I could say that 50% phone, 50% laptop or something like that.

00:42:54: But my phone is not beeping all the time, as you can hear.

00:42:59: There's no sound.

00:43:00: So it's been silent for many years.

00:43:04: It's not making any sounds.

00:43:05: This is completely quiet.

00:43:07: But my Apple Watch is giving me the sound.

00:43:09: It's constantly vibrating, right?

00:43:10: Yes, but not from emails.

00:43:12: So I have emails quiet.

00:43:14: So I don't get notifications for emails.

00:43:16: So if you want to reach me fast, you have to use WhatsApp or Signal.

00:43:20: Yeah, the messenger.

00:43:22: One of these or Twitter private message.

00:43:25: Those I get, but everything else is muted.

00:43:27: So I will read them when I have the time, but otherwise I don't.

00:43:32: Great.

00:43:33: So this leads me to the consumption part, the consumption category of my questions.

00:43:39: So yeah, I mean, you answered that already.

00:43:42: So how do you deal with the growing flood of information via various channels?

00:43:47: News and information are coming in frequently.

00:43:49: So you say that your phone is basically muted.

00:43:52: It's the same on my phone.

00:43:53: I don't even know what my ringtone sounds like.

00:43:56: I don't know.

00:43:58: So you said your messaging apps are coming to your Apple Watch and you see it quite instantly.

00:44:05: Do you have any news apps or some like breaking news and everything just to be ahead of time?

00:44:11: No.

00:44:12: I read the news when I have the time. Usually in the mornings I start with news. I see Twitter,

00:44:17: so in my feed I have some news and then I might go to some newspaper pages to

00:44:21: read more about. Yeah, just to have this time to read and to update yourself and then the

00:44:26: rest of the day you will not be notified about that. That's quite healthy I think. So and

00:44:32: how do you consume news and new knowledge for example? So reading newspapers or news

00:44:39: pages but how do you consume new knowledge? Is it newsletters like Twitter, reading books

00:44:46: or magazines or? Everything. Everything. Everything. So I try to follow so many areas because I

00:44:53: don't have just one specific area that I do because I do so many things so there's a lot

00:44:59: to follow. So I either read the articles in the morning during lunch break, during coffee

00:45:03: during the breaks or I email them to myself and I read when I have time.

00:45:09: Yeah, this is a strategy I tried in the past too, but I was never reading.

00:45:15: Yes, me neither.

00:45:16: And then at some point I was using the service, it was called Read It Later and then later on it was called Pocket.

00:45:24: and every browser and app you can share over the share sheet and say put it in

00:45:32: my pocket and my pocket grows and grows and grows and at some point I was just

00:45:38: reading okay you have like a thousand six hundred articles to read and I just

00:45:42: began with the oldest one I said okay I don't need that inside knowledge about

00:45:47: Oracle 11g right now because the news out there is already about 23c. So this is

00:45:54: something I tried, and it never worked, to read something later.

00:45:58: So unless I'm consuming it right now, it will pass.

00:46:01: And I will catch up with you as well.

00:46:02: - Yeah, that's true.

00:46:03: Yeah, it's usually better to read immediately.

00:46:05: But if it's a good article, then I email it to myself

00:46:08: and I put the title something that I can just search

00:46:11: by the title.

00:46:12: - And then you have to remember that this is something

00:46:14: you want to read and not just, okay, put it there.

00:46:17: There might be some time to read it.

00:46:18: Just, no, I will read that next week, for example.

00:46:21: Right? - Yeah.

00:46:22: - All right.

00:46:23: So the last question from the consumption category.

00:46:26: Do you turn your phone, your Apple Watch off at night?

00:46:31: Are you always reachable?

00:46:32: I'm never reachable during the night because this is muted.

00:46:35: This phone is muted and I don't have my watch when I sleep.

00:46:40: Even as the CEO of a successful company, you're not reachable at all times, right?

00:46:44: No.

00:46:45: Great.

00:46:46: I need to sleep.

00:46:47: If I don't sleep, I will be angry and nobody wants to work with me.

00:46:50: So it's much better I sleep.

00:46:52: Okay, I would say do it like this in the future.

00:46:57: So I really enjoyed talking to you, Heli.

00:46:59: Thank you for joining us at Devs on Tape.

00:47:02: And we will be happy for a second part maybe in the future.

00:47:06: - Absolutely. - We have so many topics

00:47:07: to talk about and they will not fit into this one recording.

00:47:11: So thank you very much and enjoy the rest of the conference.

00:47:14: - You too, thank you. - Thank you.

00:47:15: (upbeat music)

