
All Activity


  1. Last week
  2. Earlier
  3. I have been wanting to dive into Multimodal AI but cannot find a starting point. Would love to hear the community's opinion on this.
  4. https://www.livemint.com/ai/a-decisive-ai-breakthrough-is-about-to-transform-the-world-11731755085418.html
  5. Is it correct that there is a country-wise cap on the number of EB-1 applications considered (which results in delays in processing newer applications from countries where there is relatively more interest)?
  6. Google is hosting a comprehensive course on Generative AI starting November 11th. Everyone interested in participating needs a Kaggle account and a Google account. Here is an overview of the topics (a minimal embedding/vector-search sketch related to Day 2 appears after this activity list):
     Day 1: Foundational Models & Prompt Engineering - Explore the evolution of LLMs, from transformers to techniques like fine-tuning and inference acceleration, and get trained in the art of prompt engineering for optimal LLM interaction.
     Day 2: Embeddings and Vector Stores/Databases - Learn about the conceptual underpinnings of embeddings and vector databases, including embedding methods, vector search algorithms, and real-world applications with LLMs, as well as their tradeoffs.
     Day 3: Generative AI Agents - Learn to build sophisticated AI agents by understanding their core components and the iterative development process.
     Day 4: Domain-Specific LLMs - Delve into the creation and application of specialized LLMs like SecLM and Med-PaLM, with insights from the researchers who built them.
     Day 5: MLOps for Generative AI - Discover how to adapt MLOps practices for Generative AI and leverage Vertex AI's tools for foundation models and generative AI applications.
     Register here: https://rsvp.withgoogle.com/events/google-generative-ai-intensive (you’ll receive a badge on your Kaggle profile upon course completion!)
  7. ChatGPT has launched a new web search feature that provides fast and accurate answers with links to relevant sources. The feature considers the context of your conversation to give you the best possible answer. Try it out now at https://chatgpt.com/?hints=search - I have already tried it out myself.
  8. Thanks for sharing, @subin_vidhu! This post is a better fit for our AI News section, so I'll be moving it there. The Announcements section is intended for our community's announcements.
  9. Welcome, @Anirudh! Glad to have you onboard and look forward to your posts on NeuralNets πŸ™‚
  10. Hello everyone, I'm Anirudh πŸ™‚, an MS in Data Science student at Indiana University, Bloomington. I'm very passionate about AI; I am currently working through the Deep Learning Specialization by DeepLearning.AI and actively use Aman's website for the notes. I found this community through Aman's LinkedIn post and am hoping to connect with people, learn, and get my foot in the door of the AI industry. Excited to be a part of this community. An
  11. Hello everyone, I hope you're all having a fun time learning and applying AI. I recently completed the first 3 courses of the Deep Learning Specialization by DeepLearning.AI. To start applying what I learned, I trained a shallow neural network that I coded by hand on the Iris dataset. Now I want to hand-code a deep neural network, but I'm not sure what a good dataset for this purpose would be. I'm looking to build a network with at least 5 hidden layers and apply it to a dataset purely to learn and build intuition for how DNNs work by coding them from scratch. Basically, a "hello world" dataset for a deep neural network. I was hoping to ask here for suggestions (a minimal sketch of one option appears after this activity list). If any of you know of such datasets, please let me know. Thank you for your time. Thanks, An
  12. A short summary of the blog post by Meta on the Llama 3.2 models (a hedged sketch of trying the 1B model locally appears after this activity list):
     Llama 3.2 Models: Meta is releasing new Llama 3.2 models, including small and medium-sized vision LLMs (11B and 90B) and lightweight, text-only models (1B and 3B) that fit onto edge and mobile devices, with pre-trained and instruction-tuned versions available.
     Llama Stack Distributions: The company is also introducing Llama Stack distributions, a standardized interface for customizing Llama models and building agentic applications, with a simplified and consistent experience for developers across multiple environments.
     Enhanced Safety Features: Llama 3.2 includes new updates to the Llama Guard family of safeguards, designed to support responsible innovation and empower developers to build safe and responsible systems, with optimized Llama Guard models for on-device deployment.
  13. https://www.businesstoday.in/magazine/deep-dive/story/ai-evs-and-semiconductors-the-tech-trio-shaping-indias-blueprint-for-2047-445455-2024-09-11
  14. Hope this will help someone learning about Embedded ML.
  15. https://sites.google.com/g.harvard.edu/tinyml/home https://github.com/Mjrovai/UNIFEI-IESTI01-TinyML-2023.1?tab=readme-ov-file https://tinyml.seas.harvard.edu/courses/
  16. Welcome, @Pranav Kumar! Glad to have you onboard and look forward to your posts on NeuralNets πŸ™‚
  17. Welcome, @Nikunj Kotecha! Glad to have you onboard and look forward to your posts on NeuralNets πŸ™‚
  18. I am Pranav Kumar, currently a 2nd-year undergraduate at IIT Madras, India. I am a highly passionate individual with a deep interest in Machine Learning, Deep Learning, and Artificial Intelligence, and I am always excited to collaborate with like-minded individuals and teams in AI. If you're passionate about solving real-world challenges using cutting-edge technologies, let's connect and explore opportunities to create impactful solutions together! Thanks, everyone, and especially Aman Chadha Sir.
  19. A new series of AI models, OpenAI o1, has been released in preview. These models are designed to spend more time thinking before responding, enabling them to reason through complex tasks and solve harder problems in science, coding, and math. Do check out their blog for more: "For example, o1 can be used by healthcare researchers to annotate cell sequencing data, by physicists to generate complicated mathematical formulas needed for quantum optics, and by developers in all fields to build and execute multi-step workflows." (quote from the official OpenAI blog post). If anyone in the group has already tried using o1 for annotating cell sequencing data or similar use cases, I'd love to hear about your experience and any insights you can share!
  20. Thanks, Aman. 😊 Hello, all! I'm pursuing academics in AI and Data Science under faculty of Harvard University and the University of Pennsylvania. I blog at The Confluence (www.kunalsconduit.wordpress.com). I look forward to learning from, and contributing to, neuralnets.ai.
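
Regarding the Day 2 topic (embeddings and vector stores) in the Google Generative AI course announcement above, here is a minimal sketch of the core retrieval idea in plain NumPy. The document texts and random vectors are purely illustrative stand-ins; a real setup would use a learned embedding model and a vector database or approximate-nearest-neighbour index, which is exactly what the course covers.

```python
# Minimal sketch of "embed then retrieve by similarity" with stand-in vectors.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "embeddings": random vectors keyed by document text (hypothetical data).
docs = ["intro to transformers", "vector databases 101", "prompt engineering tips"]
doc_vecs = rng.normal(size=(len(docs), 8))

def cosine_sim(a, b):
    # Cosine similarity between two 1-D vectors.
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

query_vec = rng.normal(size=8)  # embedding of the user query (also a stand-in)
scores = [cosine_sim(query_vec, v) for v in doc_vecs]
best = int(np.argmax(scores))
print(f"nearest document: {docs[best]!r} (score={scores[best]:.3f})")
```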
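
For the "hello world" deep-network question above, one option (a suggestion, not an official recommendation) is scikit-learn's built-in digits dataset: 1,797 8x8 grayscale digit images, small enough to train a hand-coded network on in seconds. The sketch below assumes NumPy and scikit-learn are installed, and uses 5 ReLU hidden layers trained with plain mini-batch gradient descent.

```python
# Hand-coded deep neural network (5 hidden layers) on scikit-learn's digits dataset.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X = X / 16.0                                    # scale pixel values to [0, 1]
Y = np.eye(10)[y]                               # one-hot labels
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)

layers = [64, 32, 32, 32, 32, 32, 10]           # input, 5 hidden layers, output
rng = np.random.default_rng(0)
W = [rng.normal(0, np.sqrt(2 / m), (m, n)) for m, n in zip(layers[:-1], layers[1:])]
b = [np.zeros(n) for n in layers[1:]]

def forward(x):
    # Returns the layer inputs (for backprop) and the softmax probabilities.
    acts = [x]
    for i in range(len(W) - 1):
        acts.append(np.maximum(0, acts[-1] @ W[i] + b[i]))      # ReLU hidden layers
    logits = acts[-1] @ W[-1] + b[-1]
    logits -= logits.max(axis=1, keepdims=True)                 # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    return acts, probs

lr, batch = 0.1, 64
for epoch in range(50):
    perm = rng.permutation(len(X_tr))
    for start in range(0, len(X_tr), batch):
        idx = perm[start:start + batch]
        acts, probs = forward(X_tr[idx])
        grad = (probs - Y_tr[idx]) / len(idx)    # softmax + cross-entropy gradient
        for i in reversed(range(len(W))):
            dW, db = acts[i].T @ grad, grad.sum(axis=0)
            grad = (grad @ W[i].T) * (acts[i] > 0)  # backprop through ReLU
            W[i] -= lr * dW
            b[i] -= lr * db

_, probs = forward(X_te)
accuracy = (probs.argmax(axis=1) == Y_te.argmax(axis=1)).mean()
print(f"test accuracy: {accuracy:.3f}")
```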
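
Related to the Llama 3.2 summary above, here is a hedged quick-start sketch for trying the lightweight 1B instruction-tuned model with Hugging Face transformers. It assumes you have been granted access to the gated meta-llama/Llama-3.2-1B-Instruct checkpoint, are logged in to the Hugging Face Hub, and have a recent transformers release installed; it is not Meta's official recipe.

```python
# Hedged sketch: generate text with the lightweight Llama 3.2 1B Instruct model.
# Assumes access to the gated checkpoint and a recent `transformers` release.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.2-1B-Instruct",  # gated; request access on Hugging Face first
)

prompt = "In one sentence, what makes a language model suitable for edge devices?"
output = generator(prompt, max_new_tokens=64, do_sample=False)
print(output[0]["generated_text"])
```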