I’m Álvaro (but everybody calls me Chapa), from gcoop. We know we’re in the final hours, or rather minutes, before this encounter, but we want to confirm our participation. Francisco and I are going to be present today.
See you there!
Great, see you in 45 minutes in the Zoom room!
I’m Pedro from Equality, and together with my coworkers Federico and Cristian we are going to participate in this Show&Tell.
See you soon!
In position, just zoomed in!
Today’s Show & Tell was very nice and informative, kudos to the people of Fiqus, thanks @dcalero! We are looking forward to next month’s S&T, which I believe is going to be in the hands of @sz_animorph and Animorph.
Thanks everybody for participating! I hope it ignited some curiosity about the Elixir ecosystem, it’s really worth it =]
See you the next S&T, cheers and thanks again!
Thanks to everyone who participated in S&T today, the call was a success: we were 26 co-operators from 9 different cooperatives talking about technology.
Here are some links with interesting information related to what was shown during the talk.
Two web apps were shown by @dcalero
- Web application for Argentina’s surgery services, a simple surgeries/surgeons/patients manager. We implemented Vue.js as the front-end app and Elixir Phoenix as the back-end (JSON) API.
- Repository: https://github.com/fiqus/surgex
- Live websocket chat app that uses several Elixir/Erlang features like Phoenix LiveView and Erlang Mnesia. No front-end app needed! Everything is beautifully handled server-side by LiveView. =]
- Repository: https://github.com/fiqus/lqchatex
- Demo site: https://lqchatex.fiqus.coop/
- Elixir: https://elixir-lang.org/
- Phoenix: https://phoenixframework.org/
- Phoenix LiveView: https://github.com/phoenixframework/phoenix_live_view
- Phoenix PubSub: https://hexdocs.pm/phoenix_pubsub/
- Phoenix Presence: https://hexdocs.pm/phoenix/Phoenix.Presence.html
- Memento: https://github.com/sheharyarn/memento
- Mnesia: https://learnyousomeerlang.com/mnesia
- Gigalixir: https://gigalixir.com/
See you in the next S&T, during the last week of August.
That is awesome - sorry I could not make it today but I look forward to August! Thank you for the helpful notes and links.
Anyone can sign up for the low-volume list serv to get info on Agaric’s weekly Show and Tell that happens every Wednesday at 11am ET - https://lists.mayfirst.org/mailman/listinfo/showandtell
I encourage everyone to start a show and tell with your friends and possibly we can merge them all at some point! Sharing is awesome!
Missed this, but sounds like a rad idea! Is this the thread to track for future S&T’s, or will each one get a new thread?
You should sign up for the invites that have the URL for the chat - https://lists.mayfirst.org/mailman/listinfo/showandtell
I send them out on Tuesdays.
Just wanted to assure everyone we are holding S&T on the following Friday (sorry for the late reminder, I was away).
As with last month, the session will begin at 4pm London time (3pm UTC).
The call details will follow soon!
Just to be sure: “following Friday” = Fri, Aug 30, right?
(Sorry, that phrase can be a bit ambiguous in my region of the world, and sometimes means “the Friday after this upcoming one”)
Hi Patcon, apologies for the ambiguity; as stated in the post describing AR Foundation, it’s 2019-08-30T15:00:00Z. I believe the S&Ts are meant to take place on the last Friday of each month.
We will be there to enjoy your presentation.
All the best,
The access details for tomorrow’s S&T:
Topic: Show and Tell - AR Foundation
Time: Aug 30, 2019 04:00 PM London
Join our Zoom Meeting
One tap mobile
+17207072699,565409637# US (Denver)
+16465588656,565409637# US (New York)
Dial by your location
+1 720 707 2699 US (Denver)
+1 646 558 8656 US (New York)
Meeting ID: 565 409 637
Looking forward to cooking something up with you tomorrow!
Recent call notes (not recorded this time, but maybe next time!). Please feel free to edit or amend as you see fit: https://hackmd.io/GfaYW383QBGbpJHcWllYWA?view
Re: using Zoom auto-recording. I built a small pass-through app that we could fork and use. The idea is that everyone entering has a chance to understand and consent to recording, and knows how to opt out if they’d prefer not to be recorded (it also makes things easy because the host doesn’t need to share credentials or show up to hit record):
And here are the instructions on setting up a Zoom call that is most accommodating: https://hackmd.io/mm28vIgZSdyOqaU-dcHCnQ
EDIT: We’d also auto-upload Zoom cloud recordings to a YouTube playlist every 30 min, using CircleCI workflows like a “public cronjob” for scheduled tasks:
Thanks to everyone who made it to the S&T, so great to see you!
The course we recently published that was an excuse to experiment with AR Foundation: https://www.packtpub.com/gb/application-development/hands-augmented-reality-video
Repo with code: https://github.com/PacktPublishing/Hands-on-Augmented-Reality-V-
The API has already changed a bit so some of the code might require tweaking.
Intro to XR
The virtuality continuum depicts a range of possible ‘flavours’ of mixed reality, spanning between the real and virtual environments.
A presentation on XR in medicine we gave earlier this year.
AR on mobile devices is possible thanks to SLAM (Simultaneous Localisation and Mapping), delivered by embedded sensors such as the gyroscope and accelerometer (the IMU) combined with computer vision software.
Markerless AR used to rely on dedicated depth sensors like Tango’s; since 2017 and the release of ARCore and ARKit, a camera is enough… RIP Tango (open source project).
AR Foundation intro
Unity game engine launched in 2005, aiming to “democratise” game development, proprietary but free (if you make less than $100k a year).
Unity developed AR Foundation, a high-level API that allows you to develop and deploy the same app for Android and iOS (for the functionality they have in common, e.g. only ARKit supports eye tracking at the minute).
Original AR Foundation announcement: https://blogs.unity3d.com/2018/06/15/multi-platform-handheld-ar-in-2018-part-1/
Scripting documentation: https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@latest/api/UnityEngine.XR.ARFoundation.html
Samples (you can pull the repo and open it in Unity as a project) https://github.com/Unity-Technologies/arfoundation-samples
Unity Walkthrough - GIT REPO
- Download Unity 3D, the necessary SDKs/NDKs (such as Java & Android) and a code editor (for scripting we use JetBrains Rider, but it could be VS Code or any other editor really; however, only Rider & VS connect to Unity).
- Create a new Unity project and a scene, change platform in File->Build Settings (in our case Android).
- Project Settings: set graphics API to Auto + disable multithreaded rendering + change the package identifier + set the minimum API level (24).
- Setting up scene, adding XR components like in this official introduction.
- Implement code.
- Build (previewing without building to the device is still a problem, though Unity devs are working on this).
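To give a feel for the “implement code” step above, here is a minimal sketch of an AR Foundation script: a component that subscribes to `ARPlaneManager` and logs planes as SLAM detects them. The class name `PlaneLogger` and the log wording are made up for this example; the `planesChanged` event and `ARPlane` fields match the AR Foundation API around the 2019 releases, but check the scripting docs for your package version.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch: attach this to a GameObject in your AR scene and
// assign the scene's ARPlaneManager in the Inspector. It logs each plane
// the device discovers as SLAM maps the environment.
public class PlaneLogger : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    void OnEnable()  { planeManager.planesChanged += OnPlanesChanged; }
    void OnDisable() { planeManager.planesChanged -= OnPlanesChanged; }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // args also exposes .updated and .removed lists.
        foreach (ARPlane plane in args.added)
            Debug.Log($"Detected plane {plane.trackableId}, size {plane.size.x:F2}m x {plane.size.y:F2}m");
    }
}
```

Because AR Foundation abstracts over ARCore and ARKit, the same script works in both the Android and iOS builds without changes.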
If you were building for iOS, you could just copy the project to a Mac, generate the Xcode project from Unity and ship it from there.
A digital layer on top of reality, ML1’s vision: https://www.magicleap.com/news/op-ed/magicverse
Upcoming platforms potentially more open such as Lenovo ThinkReality: https://www.lenovo.com/ww/en/solutions/thinkreality
Let us know if you AR-working on something or have ideas we can assist with!
Until the next one!
Thank you so so much! It was great!