Hi Patcon, apologies for the ambiguity; as stated in the post describing AR Foundation, it’s 2019-08-30T15:00:00Z. I believe the S&Ts are meant to take place on the last Friday of each month.
We will be there to enjoy your presentation.
All the best,
The access details for tomorrow’s S&T:
Topic: Show and Tell - AR Foundation
Time: Aug 30, 2019 04:00 PM London
Join our Zoom Meeting
One tap mobile
+17207072699,565409637# US (Denver)
+16465588656,565409637# US (New York)
Dial by your location
+1 720 707 2699 US (Denver)
+1 646 558 8656 US (New York)
Meeting ID: 565 409 637
Looking forward to cooking something up with you tomorrow!
Recent call notes (not recorded this time, but maybe next time!). Please feel free to edit or amend as you see fit: https://hackmd.io/GfaYW383QBGbpJHcWllYWA?view
Re: using Zoom auto-recording. I built a small pass-through app that we could fork and use. The idea is that everyone entering has a chance to understand and consent to recording, and knows how to opt out if they’d prefer not to be recorded (it also makes things easy because the host doesn’t need to share credentials or show up to hit record):
And here are the instructions on setting up a Zoom call that is most accommodating: https://hackmd.io/mm28vIgZSdyOqaU-dcHCnQ
EDIT: We’d also auto-upload Zoom cloud recordings to a YouTube playlist every 30 minutes, using CircleCI scheduled workflows as a kind of “public cronjob” for scheduled tasks:
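For anyone curious what that looks like, here is a rough sketch of a CircleCI scheduled workflow; the job name, image, and upload script are illustrative placeholders, not the actual config from our repo:

```yaml
version: 2.1
jobs:
  upload-recordings:
    docker:
      - image: cimg/node:lts
    steps:
      - checkout
      # Hypothetical script: fetches new Zoom cloud recordings
      # and pushes them to a YouTube playlist.
      - run: ./scripts/upload-zoom-to-youtube.sh
workflows:
  every-30-min:
    triggers:
      - schedule:
          cron: "0,30 * * * *"   # top and bottom of every hour
          filters:
            branches:
              only: main
    jobs:
      - upload-recordings
```

The `schedule` trigger is what makes CircleCI act like a cronjob: the workflow runs on the cron expression rather than on pushes.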
Thanks to everyone who made it to the S&T, so great to see you!
The course we recently published that was an excuse to experiment with AR Foundation: https://www.packtpub.com/gb/application-development/hands-augmented-reality-video
Repo with code: https://github.com/PacktPublishing/Hands-on-Augmented-Reality-V-
The API has already changed a bit so some of the code might require tweaking.
Intro to XR
The virtuality continuum depicts a range of possible ‘flavours’ of mixed reality, spanning between fully real and fully virtual environments.
A presentation on XR in medicine we gave earlier this year.
AR on mobile devices is possible thanks to SLAM (Simultaneous Localisation and Mapping), delivered by embedded sensors such as the gyroscope and accelerometer (the IMU) combined with computer vision software.
Markerless AR used to rely on dedicated depth sensors, as on Google’s Tango devices; since 2017 and the release of ARCore and ARKit, a camera is enough… RIP Tango (open source project)
AR Foundation intro
The Unity game engine launched in 2005, aiming to “democratise” game development; it’s proprietary but free to use (if you make less than $100k a year).
Unity developed AR Foundation, a high-level API that lets you develop and deploy the same app to both Android and iOS (as long as the functionality is common to both platforms, e.g. only ARKit supports eye tracking at the minute).
Original AR Foundation announcement: https://blogs.unity3d.com/2018/06/15/multi-platform-handheld-ar-in-2018-part-1/
Scripting documentation: https://docs.unity3d.com/Packagesfirstname.lastname@example.org/api/UnityEngine.XR.ARFoundation.html
Samples (you can pull the repo and open it in Unity as a project) https://github.com/Unity-Technologies/arfoundation-samples
Unity Walkthrough - GIT REPO
- Download Unity 3D, the necessary SDKs/NDKs (such as the Java and Android ones) and a code editor (for scripting we use JetBrains Rider, but it could be VS Code or any other editor really; however, only Rider and Visual Studio connect to Unity).
- Create a new Unity project and a scene, then change the platform in File -> Build Settings (in our case, Android).
- Project Settings: set the graphics API to Auto + disable multithreaded rendering + change the package identifier + set the minimum API level (24).
- Set up the scene, adding XR components as in the official introduction.
- Implement code.
- Build (previewing without building to the device is still a problem, though Unity devs are working on this).
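To give a flavour of the “Implement code” step, here is a minimal sketch of a tap-to-place script using AR Foundation. It assumes an ARRaycastManager on the AR Session Origin and a prefab assigned in the Inspector; as noted below, the API has already changed a bit between versions, so this may need tweaking:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Raycasts from a screen touch against detected planes and places
// (or moves) a prefab at the hit pose.
[RequireComponent(typeof(ARRaycastManager))]
public class TapToPlace : MonoBehaviour
{
    public GameObject placedPrefab;   // assign in the Inspector

    ARRaycastManager raycastManager;
    GameObject spawnedObject;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake()
    {
        raycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Only accept hits inside the detected plane polygon.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            if (spawnedObject == null)
                spawnedObject = Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
            else
                spawnedObject.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
        }
    }
}
```

The same script runs unchanged on Android and iOS, which is the whole point of AR Foundation’s abstraction over ARCore and ARKit.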
If you were building for iOS, you could just copy the project to a Mac, have Unity generate the Xcode project, and ship it from there.
Digital layer on top of the reality, ML1’s vision: https://www.magicleap.com/news/op-ed/magicverse
Upcoming platforms potentially more open such as Lenovo ThinkReality: https://www.lenovo.com/ww/en/solutions/thinkreality
Let us know if you’re AR-working on something or have ideas we can assist with!
Until the next one!
Thank you so, so much! It was great!
@matt from the Outlandish co-op has finally confirmed his talk:
“I will be sharing some work that we’ve done as part of an EU-funded research project called COLA, which is about orchestrating containers and VMs based on application-level scaling policies (instead of resource-based scaling policies).”
We will be using the following zoom id: 8056484832
See you tomorrow at 3 PM UTC.
I was a little late getting to the call, but I’ve got a pad for running notes from here on out!
Anyone is welcome to add to them if they were taking their own (and please do feel free to haaaalp next time)
Is there a call today? I had it as a recurring event in my calendar.
As you all know, on the last Friday of each month we organize the International Show&Tell, a space where technology cooperatives can share time, talking about technology while building relationships based on trust.
This month we’ll make an exception: technically today is the last Friday of the month, but holding it today would be very close to the previous S&T, and there is still almost a week left until the end of the month. So please don’t get confused, the change is only for this month!
So next Friday, November 1st at 3 PM UTC, we’ll hold the next Show&Tell. This time Mariano Lambir from Fiqus will present a project he worked on together with three other developers from the cooperative. The project was developed for SpawnFest 2019, an international contest in which you have two days to build an application in Elixir.
The Fiqus team developed an app to create HTML presentations from Markdown. Something remarkable is that you can run code live while the presentation is going on (Elixir by default, but you can run Python too!). It also comes with a presenter’s view with comments, plus some more things that Mariano will tell us about during the presentation. It’s quite simple to use; you just need Erlang and Elixir installed. If you want to read more, this is the repo: https://github.com/spawnfest/prexent
I’m also sharing a tweet where we announced its launch:
We look forward to seeing you!
All the best,
Breaking news: we have just been informed that the project to be shared next Friday won the #usefulness and #completion awards in the contest!
Great! Sounds interesting.
We will be there next week
See you all next Friday!
Congratulations!!! We will be there too!
Lush, well done, looking forward to Friday!
Remember, in one hour we will do the S&T.
We will use zoom to communicate: https://zoom.us/
It has a web client, but we recommend downloading the desktop client: https://zoom.us/download
The room ID we will use is: 805-648-4832
If you have any questions, please don’t hesitate to ask.
We look forward to seeing you!
Again, running notes are here: https://pad.drutopia.org/p/global-coop-show-tell
(Sorry, was late arriving this month!)
Sorry to have missed it; did you perhaps record the presentation?
Currently using https://remarkjs.com/ but will check prexent out!
As you know, on the last Friday of every month we organize the “International Show&Tell”, a space where tech co-ops can share time and talk about technology while continuing to build links.
This time the Show&Tell is going to be a little different: we are still going to talk about technology, but as applied to a product!
A French cooperative called Digicoop developed a product called Kantree and based their business model around it. It’s interesting how they describe their cooperative experience and how they apply cooperative principles to their product.
Since it’s not such a technical talk, but rather about a cooperative experience built around an application of technology, a broader audience can participate, not just developers.
The zoom link: 805-648-4832
I hope you can join us tomorrow at 3 PM (UTC)!