All Activity
- Last week
-
stayDeveloper joined the community
- Earlier
-
disidentified_soul joined the community
-
harsh started following What's the best way to start with learning Multimodal models?
-
I have been wanting to dive into Multimodal AI but cannot find a starting point. I would love to hear the community's opinion on this.
-
harsh joined the community
-
https://www.livemint.com/ai/a-decisive-ai-breakthrough-is-about-to-transform-the-world-11731755085418.html
-
Confirming EB-1 country-wise limits and resultant backlogs
Aman replied to Koonaal's topic in Profile Prep
This is correct.
jayanth joined the community
-
nyoomasAbarf joined the community
-
vibbsxAbarf joined the community
-
mikepreor joined the community
-
DayDay joined the community
-
Is it correct that there is a country-wise cap on the number of EB-1 applications considered (which results in processing delays for newer applications from countries with relatively more interest)?
-
Google is hosting a comprehensive course on Generative AI starting November 11th. Everyone interested in participating needs a Kaggle account and a Google account. Here is the topic overview:
- Day 1: Foundational Models & Prompt Engineering: Explore the evolution of LLMs, from transformers to techniques like fine-tuning and inference acceleration, and get trained in the art of prompt engineering for optimal LLM interaction.
- Day 2: Embeddings and Vector Stores/Databases: Learn the conceptual underpinnings of embeddings and vector databases, including embedding methods, vector search algorithms, and real-world applications with LLMs, as well as their tradeoffs.
- Day 3: Generative AI Agents: Learn to build sophisticated AI agents by understanding their core components and the iterative development process.
- Day 4: Domain-Specific LLMs: Delve into the creation and application of specialized LLMs like SecLM and Med-PaLM, with insights from the researchers who built them.
- Day 5: MLOps for Generative AI: Discover how to adapt MLOps practices for Generative AI and leverage Vertex AI's tools for foundation models and generative AI applications.
Register here: https://rsvp.withgoogle.com/events/google-generative-ai-intensive (you'll receive a badge on your Kaggle profile upon course completion!)
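The Day 2 topic above (embeddings and vector search) can be sketched in a few lines: documents and a query are represented as vectors, and retrieval is a nearest-neighbor lookup by cosine similarity. The "embeddings" below are hypothetical hand-made vectors chosen purely for illustration; a real system would produce them with an embedding model and index them in a vector database.

```python
import numpy as np

# Hypothetical hand-made "embeddings" for three documents (illustration only;
# a real system would get these from an embedding model).
docs = {
    "cats": np.array([1.0, 0.1, 0.0]),
    "dogs": np.array([0.9, 0.2, 0.1]),
    "stocks": np.array([0.0, 0.1, 1.0]),
}

def cosine(a, b):
    # Cosine similarity: angle-based closeness, independent of vector length.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(query_vec, docs):
    # Brute-force nearest neighbor; vector databases exist to index exactly
    # this step so it stays fast at millions of documents.
    return max(docs, key=lambda name: cosine(query_vec, docs[name]))

query = np.array([0.98, 0.12, 0.02])  # a query embedded near the "cats" region
best = search(query, docs)
```

The tradeoff the course mentions shows up here: brute-force search is exact but linear in corpus size, which is why approximate nearest-neighbor indexes are used at scale.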
-
Ritu joined the community
-
Emiliano Volpi started following subin_vidhu
-
Emiliano Volpi started following Aman
-
Thanks for sharing, @subin_vidhu! This post is a better fit for our AI News section, so I'll be moving it there. The Announcements section is intended for our community's announcements.
- 1 reply
-
- 1
-
Welcome, @Anirudh! Glad to have you onboard and look forward to your posts on NeuralNets!
-
Welcome, Anirudh!
-
Hello everyone, I'm Anirudh, an MS in Data Science student at Indiana University Bloomington. I'm very passionate about AI; I'm currently working through the Deep Learning Specialization by DeepLearning.AI, and I actively use Aman's website for the notes. I found this community through Aman's LinkedIn post and was hoping to connect with people to learn and get my foot in the door of the AI industry. Excited to be a part of this community. An
-
Hello everyone, I hope you're all having a fun time learning and applying AI. I recently completed the first three courses of the Deep Learning Specialization by DeepLearning.AI. To start applying what I learned, I trained a shallow neural network that I hand-coded on the Iris dataset. Now I want to hand-code a deep neural network, but I'm not sure what a good dataset for this purpose would be. I'm looking to build a network with at least 5 hidden layers and apply a dataset to it, just to get a good intuition of how DNNs work by coding them by hand. Basically, a "hello world" dataset for a deep neural network. I was hoping to ask here for suggestions: if any of you know of such datasets, please let me know. Thank you for your time. Thanks, An
-
Tagged with:
- dataset
- deep neural network
(and 2 more)
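For the "hello world" DNN question above, a tiny synthetic 2-D dataset (or a small classic one like Iris or sklearn's digits) is usually enough, since the goal is intuition rather than accuracy. Below is a minimal NumPy sketch of a hand-coded network with 5 hidden layers trained by manual backprop on two Gaussian blobs; all layer sizes, the learning rate, and the step count are illustrative assumptions, not a prescription.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: two well-separated Gaussian blobs (binary classification).
# Synthetic data keeps the example self-contained with no downloads.
X = np.vstack([rng.normal(-1, 0.5, (100, 2)), rng.normal(1, 0.5, (100, 2))])
y = np.vstack([np.zeros((100, 1)), np.ones((100, 1))])

# Architecture: 2 inputs -> 5 hidden layers of 8 ReLU units -> 1 sigmoid output.
sizes = [2, 8, 8, 8, 8, 8, 1]
W = [rng.normal(0, np.sqrt(2 / m), (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
b = [np.zeros((1, n)) for n in sizes[1:]]

def forward(X):
    # Returns all layer activations; the last entry is the sigmoid output.
    acts = [X]
    for i in range(len(W)):
        z = acts[-1] @ W[i] + b[i]
        a = 1 / (1 + np.exp(-z)) if i == len(W) - 1 else np.maximum(z, 0)
        acts.append(a)
    return acts

lr = 0.1
for step in range(500):
    acts = forward(X)
    # Backprop: for sigmoid + binary cross-entropy, the gradient w.r.t. the
    # output-layer pre-activation simplifies to (p - y), averaged over the batch.
    delta = (acts[-1] - y) / len(X)
    for i in reversed(range(len(W))):
        gW = acts[i].T @ delta
        gb = delta.sum(axis=0, keepdims=True)
        if i > 0:
            # Propagate through the layer, then through the ReLU derivative.
            delta = (delta @ W[i].T) * (acts[i] > 0)
        W[i] -= lr * gW
        b[i] -= lr * gb

preds = (forward(X)[-1] > 0.5).astype(float)
accuracy = (preds == y).mean()
```

Once this converges on the blobs, swapping in Iris (or digits, with one-hot labels and a softmax output) is a natural next step.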
-
A short summary from the blog by Meta on the Llama 3.2 models:
- Meta Llama 3.2 models: Meta is releasing new Llama 3.2 models, including small and medium-sized vision LLMs (11B and 90B) and lightweight, text-only models (1B and 3B) that fit onto edge and mobile devices, with pre-trained and instruction-tuned versions available.
- Llama Stack distributions: The company is also introducing Llama Stack distributions, a standardized interface for customizing Llama models and building agentic applications, with a simplified and consistent experience for developers across multiple environments.
- Enhanced safety features: Llama 3.2 includes new updates to the Llama Guard family of safeguards, designed to support responsible innovation and empower developers to build safe and responsible systems, with optimized Llama Guard models for on-device deployment.
-
Hope this will help someone learning about Embedded ML.
-
https://sites.google.com/g.harvard.edu/tinyml/home
https://github.com/Mjrovai/UNIFEI-IESTI01-TinyML-2023.1?tab=readme-ov-file
https://tinyml.seas.harvard.edu/courses/
- 1 reply
-
- 1
-
Welcome, @Pranav Kumar! Glad to have you onboard and look forward to your posts on NeuralNets!
-
Welcome, @Nikunj Kotecha! Glad to have you onboard and look forward to your posts on NeuralNets!
-
I am Pranav Kumar, currently a 2nd-year undergraduate at IIT Madras. I am from India. I am a highly passionate individual with a deep interest in Machine Learning, Deep Learning, and Artificial Intelligence, and I am always excited to collaborate with like-minded individuals and teams in AI. If you're passionate about solving real-world challenges using cutting-edge technologies, let's connect and explore opportunities to create impactful solutions together! Thanks, everyone, and especially Aman Chadha Sir.
- 1 reply
-
- 1
-
Pranav Kumar started following Introduce Yourself
-
A new series of AI models, OpenAI o1, has been released in preview. These models are designed to spend more time thinking before responding, enabling them to reason through complex tasks and solve harder problems in science, coding, and math. Do check out their blog for more: "For example, o1 can be used by healthcare researchers to annotate cell sequencing data, by physicists to generate complicated mathematical formulas needed for quantum optics, and by developers in all fields to build and execute multi-step workflows." (quote from the official OpenAI blog post) If anyone in the group has already tried using o1 for annotating cell sequencing data or similar use cases, I'd love to hear about your experience and any insights you can share!
-
Thanks, Aman. Hello, all! I'm pursuing academic studies in AI and Data Science under faculty at Harvard University and the University of Pennsylvania. I blog at the Confluence (www.kunalsconduit.wordpress.com). I look forward to new learnings from, and contributing to, neuralnets.ai.
- 1 reply
-
- 2