Code Like A Girl

Pushing the conversation on gender equality.

13 Things I Learned at Web Summit Lisbon 2017!

My experience as a first-time conference-goer: Tuesday 7 November 2017

Today is the first day of the first tech conference I have ever been to. I thought it would be a useful exercise for me and for anyone else interested to jot down my thoughts and experience of the whole thing.

Well, first of all, the ‘first day’ part isn’t strictly true. For more on why this was not really the first day and to find out a little bit about how it feels to walk into a tech conference for the first time, check out My First Day at Web Summit.

Now, without further ado, I’m going to share some of what I learned today. I went to talks in the AutoTech/TalkRobot, Planet: tech and Creatiff (design) conferences:

  1. Within 20 years, we will need 40% more fresh water than the Earth actually has. If you have any ideas for closing that gap (or any other cool technology-based way of saving the planet), contact Robert Bernard at Microsoft, as the AI for Earth program is funding these kinds of projects.
  2. A few more crazy stats — 1 in 9 people don’t get enough to eat (which is more than I thought, and makes me feel more than a little ridiculous when I think about how much I worry that I’m eating too much and worry about being fat (I’m not) — really, I should just consider myself lucky I have sufficient food and adequate nutrition). Also, 60 to 65 million people are currently displaced (and therefore in difficulty when it comes to finding a food source), the most there has been since WWII. Robert Opp from the UN World Food Programme outlined some tech-based solutions to countering hunger, including using drones to map conflict areas and drop food parcels to communities who otherwise can’t be reached.
  3. I was also interested to learn that there is a company out there — Climeworks — which can take carbon dioxide from the atmosphere and convert it to minerals, and another company, LanzaTech, which can take the waste gases emitted by steel factories and convert them to fuel. These technologies still seem to be in fairly early stages and apparently we need to be able to remove 10–15 gigatonnes of carbon dioxide annually from the Earth’s atmosphere. However, the goal here is to see carbon as an opportunity rather than a liability and recycle what is normally considered a waste product.
  4. It’s predicted that the next major war will start online. Also, relations between countries online tend to be somewhat less friendly than those offline (and that says something). Furthermore, the issues of fake news and propaganda spread by fake profiles (sometimes in the employ of a government) complicate these matters. Jared Cohen from Google Jigsaw spoke about the possibilities of using AI and big data to track who is spreading this and why. He also showcased a project designed to prevent radicalised teens from joining ISIS: the first few search results for questions they might ask, like ‘How can I become a nurse for ISIS?’, redirect to pages which show them the realities of that life rather than the propaganda used to enlist them.
Me with the hustle and bustle of Pavilion 2 in the background

5. A lot of folks are worried about AI at the moment and the possibility of machines outsmarting us, taking over the world and being ‘evil’. Max Tegmark from the Future of Life Institute talked about his vision for seeing AI come to safe fruition — namely, by adopting a ‘safety engineering’ approach which proceeds with caution at all stages of development and adheres strictly to the 23 Asilomar AI Principles. One of the main concerns is that robots will embody the values of humans — which aren’t necessarily ethical — and will be able to act on those values more effectively than we can, which is a problem when the ‘values’ involved amount to, for example, murdering people. Hence the need for an agreed code of conduct and legislation to enforce it.

6. The above was touched on in a discussion actually involving two robots — Sophia the Robot, who has just been granted citizenship in Saudi Arabia (I have to admit, the mind boggles here), and Professor Einstein Robot, who, in keeping with his namesake, expressed concern that it is the human values used when building robots which will cause the problem here. I found this talk interesting to watch (and it was certainly popular — the enormous Altice Arena entirely filled up) and couldn’t quite figure out if the robots were already at the stage where they could develop their own answers to the questions posed to them, or if their answers had been programmed in. I also admit to finding them pretty creepy, though it’s hard to pinpoint why — perhaps such an obvious but not-quite-there-yet attempt to mimic human naturalism? Something also doesn’t sit well with me with regard to robots being granted citizen status. This is an interesting ethical question: while these robots are probably already smarter in some ways than many humans, they also can’t feel and aren’t alive. Obviously this raises further questions, such as: does a human have to feel to be considered human? And should we change the parameters for what constitutes being ‘alive’? I don’t think there are easy answers, and though my mind has been considerably opened today, I’m not sure it’s quite open enough yet to embrace the possibility of robots, however intelligent, being considered as human.

7. In the above talk, I also learned that there is what I would describe as an open-source platform, SingularityNet, to which AI coders can upload AI modules and which can be used by anyone possessing AI technology — I just thought this was pretty cool.

8. There is an actual AI therapy chatbot out there — Woebot — using CBT techniques, which has been shown (though only in a study of 70 people so far) to relieve the symptoms of anxiety and depression. It has to be admitted that this is a pretty wonderful use of AI, and one with a purpose well rooted in the principles of the Copenhagen Letter. I particularly love the line:

We must design tools that we would love our loved ones to use. We must question our intent.

9. As speaker Sarah Ashman said (she may have been quoting here, I don’t rightly remember), it is ironic that the deeper we get into AI, the better we know ourselves and the more we find out about humanity — because we need to do this in order to be able to accurately codify it. Admittedly, I did not find a lot of what she said reassuring, in that it seems no one in power — neither governments nor tech giants — is really taking responsibility for controlling and legislating AI. She encouraged us all to believe in our own agency, but I have to say that I never feel I have all that much power, either as a voter or as a consumer.

10. I went to what was titled a developer workshop. I thought this would be a workshop. It wasn’t. It was basically just a talk, except there was a guy live-coding at the front. Sadly, technical issues meant we couldn’t actually see what was on the screen most of the time as it kept cutting to a screensaver and also, the code ended up having a bug in it and therefore didn’t run. When we did our final project at Makers Academy, we were warned not to live code anything and sadly, this illustrated exactly why.

11. I did, however, learn from this talk that 70% of all mobile apps are data-driven and that most people are using DynamoDB for this. I also picked up a little (though less than I had hoped, due to the aforementioned technical issues) about how to build cloud-connected apps in React Native, hosted on AWS.

12. I learned about the existence of the petabyte. I confess it is a number I can’t really get my head around; suffice to say that it is big, and already we have technologies that generate this much data on a daily basis.
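For anyone else struggling to picture the scale, here is a quick back-of-the-envelope sketch in Python. Nothing here is specific to any talk or vendor — it just walks up the standard decimal (SI) byte units:

```python
# Decimal (SI) byte units, from a single byte up to a petabyte.
units = ["byte", "kilobyte", "megabyte", "gigabyte", "terabyte", "petabyte"]

for power, name in enumerate(units):
    size = 10 ** (3 * power)
    print(f"1 {name} = 10^{3 * power} bytes = {size:,}")

# One petabyte is a thousand terabytes, or a million gigabytes.
petabyte = 10 ** 15
gigabytes_per_petabyte = petabyte // 10 ** 9
print(f"{gigabytes_per_petabyte:,} GB in a PB")  # 1,000,000 GB in a PB
```

In other words, a petabyte is roughly a million ordinary laptop hard drives’ worth of data every single day.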

13. Blippar has released an AR city discovery map which overlays street names and information onto the image shown through your phone’s camera. It’s accurate to 1 metre (GPS is normally accurate to 16–25 metres) because it uses image recognition from your camera rather than just getting your position from a satellite. I did wonder how much battery that would use — my juice runs out fast when I’m using Google Maps. That being said, I have to admit I could really have done with something like that yesterday. The irony here is that Ambarish Mitra (CEO of Blippar) had taken a video of himself using the app in the same quarter and at the same time as I was wandering around looking for a library… it’s just a pity I didn’t bump into him!

And finally… I saw a self-driving car!

With that, talks were over for the day, so I headed out to the ‘sunset summit’, basically a venue where folks can go to network and/or unwind. I stayed for about an hour, but if I’m really honest, by this point I was knackered and just wanted to come home. I’m actually attending the conference alone and don’t feel the pressing need to network — in such cases, it’s a little tricky to find the motivation to walk up and introduce oneself to total strangers, though I’m sure I missed out on some very interesting conversations. However, this post is long enough, so perhaps I will save that thorny issue for another day!

If you want to hear more about my adventures at Web Summit…

My first day at Web Summit

Flying cars and more: 6 Cool Ideas from Web Summit 2017

Is online privacy a thing of the past? Data and security at Web Summit 2017