Part hackathon, part learnathon—with the emphasis solidly on learning—the annual DevFest took place last month with 1,300 participants attending. One of Columbia's largest tech events and hosted annually by ADI (Application Development Initiative), DevFest is a week of classes, mini-lectures, workshops, and partner tech talks, capped off by an 18-hour hackathon—all interspersed with socializing, meetups, free food, and fun events.
Open to any student from Columbia College, Columbia Engineering, General Studies, and Barnard, DevFest had something for everyone.
Beginners, even those with zero coding experience, could take introductory classes in Python, HTML, and JavaScript. Those already coding had a chance to deepen their programming knowledge through micro-lectures on iOS, data science, and Civic Data, and through workshops on geospatial visualization, UI/UX, web development, Chrome extensions, and more. DevFest partner Google gave tech talks on TensorFlow and Google Cloud Platform, with Google engineers onsite giving hands-on instruction and valuable advice.
Every evening, DevSpace offered self-paced, online tutorials to guide DevFest participants through the steps of building a fully functioning project. Four tracks were offered: Beginner Development (where participants set up a working website), Web Development, iOS Development, and Data Science. On hand to provide more assistance were TAs, mentors, and peers.
This emphasis on learning within a supportive community made for a diverse group rather than the usual hackathon mix: 60% were attending their first hackathon, 50% were women, and 25% identified as people of color.
DevFest events kicked off on Monday, February 12, with talks by computer science professor Lydia Chilton (an HCI researcher) and Jenn Schiffer, a pixel artist and tech satirist; events concluded with an 18-hour hackathon starting Saturday evening and continuing through Sunday.
Ends with a hackathon
Thirty teams competed in the DevFest hackathon. Limited to a few members each, teams either arrived already formed or coalesced during a team-forming event where students pitched ideas to attract others with the needed skills.
The $1500 first-place prize went to Eyes and Ears, a video service aimed at making information within video more widely accessible, both by translating it for those who don't speak the language in the video and by providing audio descriptions of a scene's content for people with visual impairments. The aim was to erase barriers preventing people from accessing the increasing amount of information available in video. Someone using the service simply chooses a video, selects a language, and then waits for an email delivering the translated video complete with audio scene descriptions.
Eyes and Ears is a project of four computer science MS students—Siddhant Somani, Ishan Jain, Shubham Singhal, and Amit Bhat—who came to DevFest with the intent to compete as a team in the hackathon. The idea for the project came after they attended Google's Cloud workshop and learned there about an array of Google APIs. The question then became finding a way to combine APIs to create something with an impact for good.
The initial idea was to "simply" translate a video from any language to any other language supported by Google Translate (roughly 80% of the world's languages). However, having built the translation pipeline, the team realized the pipeline could be extended to include audio descriptions of the video's visual scenes, both when the scene changes and on the user's request.
That such a service is even possible—let alone buildable in 18 hours—is due to the power of APIs to perform complex technology tasks.
Eyes and Ears: An end-to-end pipeline to make information in video more accessible through translations and audio scene descriptions.
It's in the spaces between the many APIs, and in the sequential chaining of those APIs, that the engineering effort was required. The team had to shepherd the output of one API into the input of another, sync pauses to the audio (which required an algorithm for detecting the start and stop of speech), and sync the pace of one language to the pace of the other (taking into account the differing number of words). Because the Google Video Intelligence API, which is designed for indexing video only, outputs sparse single words (mostly nouns like "car" or "dog"), the team had to construct full, semantically correct sentences. All in 18 hours.
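For readers curious what such chaining looks like in practice, here is a minimal sketch, under stated assumptions, of the basic flow the article describes: transcribe a short audio clip with Google's Speech-to-Text REST API, translate the transcript with the Translation API (v2), and synthesize speech in the target language with the Text-to-Speech API. This is not the team's code; it assumes the video's audio has already been extracted to a 16 kHz LINEAR16 WAV file, and the file names and API key are placeholders. The real pipeline did considerably more, including pause detection, pacing adjustments, and sentence construction from Video Intelligence labels.

```python
# Sketch only: chain three Google Cloud REST APIs to translate a short clip's speech.
import base64
import requests

KEY = "YOUR_GOOGLE_API_KEY"   # placeholder API key


def transcribe(wav_path, lang="en-US"):
    """Speech-to-Text: short LINEAR16 WAV in, transcript string out."""
    with open(wav_path, "rb") as f:
        content = base64.b64encode(f.read()).decode()
    body = {"config": {"encoding": "LINEAR16", "sampleRateHertz": 16000,
                       "languageCode": lang},
            "audio": {"content": content}}
    resp = requests.post(
        f"https://speech.googleapis.com/v1/speech:recognize?key={KEY}",
        json=body).json()
    return " ".join(r["alternatives"][0]["transcript"] for r in resp["results"])


def translate(text, target="es"):
    """Translation API v2: source text in, translated text out."""
    resp = requests.post(
        "https://translation.googleapis.com/language/translate/v2",
        data={"q": text, "target": target, "format": "text", "key": KEY}).json()
    return resp["data"]["translations"][0]["translatedText"]


def synthesize(text, lang="es-ES", out_path="translated.mp3"):
    """Text-to-Speech: text in, MP3 file out."""
    body = {"input": {"text": text},
            "voice": {"languageCode": lang},
            "audioConfig": {"audioEncoding": "MP3"}}
    resp = requests.post(
        f"https://texttospeech.googleapis.com/v1/text:synthesize?key={KEY}",
        json=body).json()
    with open(out_path, "wb") as f:
        f.write(base64.b64decode(resp["audioContent"]))
    return out_path


if __name__ == "__main__":
    transcript = transcribe("video_audio.wav")    # step 1: recognize the speech
    spanish = translate(transcript, target="es")  # step 2: translate the transcript
    synthesize(spanish, lang="es-ES")             # step 3: speak the translation
```

Even this toy version shows where the real work lives: each step's output format has to be reshaped before the next step can consume it.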
The project earned praise from Google engineers for its creative and ambitious use of APIs. In addition to the first-place prize, Eyes and Ears was named best use of Google Cloud API.
The team will look to continue work on Eyes and Ears at future hackathons.
The $1000 second-place prize went to Nagish (Hebrew for "accessible"), a platform that makes it simple for people with hearing or speaking problems to make and receive phone calls using their smartphone. For incoming and outgoing calls, Nagish converts text to speech and speech to text in real time so voice phone calls can be read or generated via Facebook Messenger. All conversions are done in real time for seamless and natural phone conversations.
The Nagish team—computer science majors Ori Aboodi, Roy Prigat, Ben Arbib, Tomer Aharoni, and Alon Ezer—are five veterans who were motivated to help fellow veterans as well as others with hearing and speech impairments.
Doing so required a fairly complex environment made up of several APIs (Google's text-to-speech and speech-to-text as well as the Twilio API for generating phone numbers for each user and retrieving the MP3 files of the calls), all made to "talk" to one another via custom Python programs. Additionally, the team created a chatbot to connect Nagish to the Facebook platform.
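As a rough illustration of how such pieces might be wired together (a sketch under assumptions, not Nagish's actual code), the snippet below uses a small Flask app as the glue: a Twilio voice webhook records an incoming caller, Google's Speech-to-Text REST API transcribes the recording, and the Facebook Messenger Send API delivers the text to a user. The route names, tokens, and recipient ID are hypothetical placeholders, and Nagish's real-time, two-way conversion is considerably more involved than this one-way flow.

```python
# Sketch only: glue Twilio voice, Google Speech-to-Text, and Messenger with Flask.
import base64
import requests
from flask import Flask, request
from twilio.twiml.voice_response import VoiceResponse

app = Flask(__name__)
GOOGLE_KEY = "YOUR_GOOGLE_API_KEY"        # placeholder
FB_PAGE_TOKEN = "YOUR_PAGE_ACCESS_TOKEN"  # placeholder Messenger page token
RECIPIENT_PSID = "USER_PSID"              # placeholder Messenger user id


@app.route("/voice", methods=["POST"])
def incoming_call():
    """Answer an incoming Twilio call and record what the caller says."""
    resp = VoiceResponse()
    resp.say("Please speak after the beep.")
    # Twilio POSTs the recording details to /recorded when the caller finishes.
    resp.record(action="/recorded", max_length=30)
    return str(resp)


@app.route("/recorded", methods=["POST"])
def recorded():
    """Transcribe the recording and push the text to Facebook Messenger."""
    # Fetching the WAV may require HTTP basic auth depending on account settings.
    audio = requests.get(request.form["RecordingUrl"] + ".wav").content
    stt_body = {
        "config": {"encoding": "LINEAR16", "sampleRateHertz": 8000,
                   "languageCode": "en-US"},
        "audio": {"content": base64.b64encode(audio).decode()},
    }
    stt = requests.post(
        f"https://speech.googleapis.com/v1/speech:recognize?key={GOOGLE_KEY}",
        json=stt_body).json()
    text = stt["results"][0]["alternatives"][0]["transcript"]

    # Messenger Send API: deliver the transcript as a chat message.
    requests.post(
        f"https://graph.facebook.com/v2.6/me/messages?access_token={FB_PAGE_TOKEN}",
        json={"recipient": {"id": RECIPIENT_PSID}, "message": {"text": text}})
    return ("", 204)
```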
For providing a needed service to those with hearing and speech impairments, the team won Best Hack for Social Good.
Of course, potential users don't have to be hearing- or speech-impaired to appreciate how Nagish makes it possible to unobtrusively take an important, or not so important, phone call during a meeting or maybe even during class.
With Nagish installed on a smartphone, Facebook Messenger becomes a platform for making and receiving silent phone calls via speech-text conversion.
Taking the $750 third-place prize was Three a Day, a platform that matches restaurants or people wanting to donate food with those in need of food donations. The goal is to make sure each individual gets three meals a day. The two-person team (computer science majors Kanishk Vashisht and Sambhav Anand) built Three a Day using Firebase as the database and React for the front end, with back-end computing supplied primarily by Google Cloud Functions. A DigitalOcean server runs cron jobs to schedule the matching of restaurants and charities. The team also won for the best use of DigitalOcean products.
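To illustrate the kind of scheduled matching described above (a sketch under assumptions, not the team's code), the script below could be run from a DigitalOcean crontab: it reads donation and request records from a Firebase Realtime Database over its REST API and greedily pairs them. The database URL, the "donations" and "requests" paths, and the record fields are hypothetical.

```python
# Sketch only: a cron-driven matching job over a Firebase Realtime Database.
# Example crontab line to run it every hour:
#   0 * * * * /usr/bin/python3 /opt/threeaday/match.py
import requests

DB = "https://three-a-day-example.firebaseio.com"   # hypothetical project URL


def fetch(path):
    """Read a node from the Firebase Realtime Database REST API."""
    data = requests.get(f"{DB}/{path}.json").json()
    return data or {}


def main():
    donations = fetch("donations")   # e.g. {id: {"meals": 10, "matched": False}}
    needs = fetch("requests")        # e.g. {id: {"meals": 3, "matched": False}}

    open_donations = [(k, v) for k, v in donations.items() if not v.get("matched")]
    open_needs = [(k, v) for k, v in needs.items() if not v.get("matched")]

    # Greedy pairing: give each open request the first donation large enough.
    for need_id, need in open_needs:
        for don_id, don in open_donations:
            if don["meals"] >= need["meals"] and not don.get("matched"):
                requests.patch(f"{DB}/requests/{need_id}.json",
                               json={"matched": True, "donation": don_id})
                requests.patch(f"{DB}/donations/{don_id}.json",
                               json={"matched": True, "request": need_id})
                don["matched"] = True   # skip this donation for later requests
                break


if __name__ == "__main__":
    main()
```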
Posted Mar 22, 2018
– Linda Crane