Enthought
United States
Joined July 24, 2009
Enthought is a global consulting and software company that powers digital transformation for science. As the creators of the SciPy package and founders of the conference for Scientific Computing with Python, we’ve been leaders of scientific software development for over 20 years.
Our technology and deep scientific expertise enable faster discovery and continuous innovation. We solve complex problems for the most innovative and respected organizations across the life sciences, material sciences, chemical, semiconductor, and energy industries. Enthought helps companies leverage data strategy, modeling, simulation, machine learning and AI to accelerate scientific discovery and uncover new revenue opportunities through the transformation of people, processes, and technologies.
enthought.com #digitaltransformation #python #scipy #machinelearning #ai #training #software #datascience #materialsinformatics #materialsscience #bioinformatics #biopharmaceutics #semiconductor
WEBINAR: Materials Informatics for Product Development: Deliver Big with Small Data
Have small data challenges but want to leverage Materials Informatics? This is a common situation for many industry labs. At Enthought, we have tackled many materials and chemicals product development challenges and have employed multiple techniques for getting the most value out of small data to meet innovation goals. In this webinar co-hosted with Chemical & Engineering News, we present proven strategies and tips for how teams can make the most of what data they have and set a course towards continuous improvement through Materials Informatics.
Contact us to learn more about digital solutions for materials science and chemistry-driven R&D: www.enthought.com/materials-science-chemistry
Connect with us!
www.enthought.com
www.linkedin.com/company/enthought
enthought
Views: 747
Videos
Enthought Academy - Deep Model Evaluation Short
459 views · a year ago
Connect with us! www.linkedin.com/company/enthought
The Future of the Science Lab is Digital
10K views · a year ago
Enthought powers digital transformation for science. For more about building your R&D lab of the future, visit www.enthought.com/lab-of-the-future. Enthought's advanced computing and deep scientific expertise enable companies to accelerate discovery and innovation in the R&D lab and better compete in today's marketplace. We specialize in transforming organizations in the pharmaceutical, biotech...
Day 1 Lightning Talks | SciPy 2022
1.3K views · a year ago
SciPy Tools Plenary Session - Day 3 | SciPy 2022
469 views · a year ago
SciPy Tools Plenary Session - Day 2 | SciPy 2022
504 views · a year ago
Q&A and Panel Maintainers Track
158 views · a year ago
UFuncs and DTypes new possibilities in NumPy | SciPy 2022
654 views · a year ago
SciPy Tools Plenary Session - Day 1 | SciPy 2022
856 views · a year ago
Introduction to Numerical Computing With NumPy - Logan Thomas | SciPy 2022
8K views · a year ago
NumPy provides Python with a powerful array processing library and an elegant syntax that is well suited to expressing computational algorithms clearly and efficiently. We'll introduce basic array syntax and array indexing, review some of the available mathematical functions in NumPy, and discuss how to write your own routines. Along the way, we'll learn just enough about Matplotlib to display ...
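As a rough illustration of the array syntax, indexing, and built-in math the tutorial covers (this snippet is our own sketch, not the tutorial's notebook):

```python
import numpy as np

# Create an array and use vectorized operations instead of Python loops.
a = np.arange(6).reshape(2, 3)   # [[0 1 2], [3 4 5]]

# Basic slicing: the second column of every row.
col = a[:, 1]                    # array([1, 4])

# Elementwise math and a built-in reduction.
squared = a ** 2
total = squared.sum()            # 0+1+4+9+16+25 = 55

# Boolean (mask) indexing selects elements matching a condition.
evens = a[a % 2 == 0]            # array([0, 2, 4])
```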
Monaco: Quantify Uncertainty & Sensitivities in Computational Models w/ Monte Carlo Lib | SciPy 2022
1.7K views · a year ago
Roll the dice! Quantify uncertainty and sensitivities in your existing computational models with the “monaco” Monte Carlo library. Users define input variables randomly drawn from any of SciPy's statistical distributions, run their model in parallel anywhere from 1 to millions of times, and postprocess the outputs to obtain meaningful, statistically significant conclusions. This talk will go ov...
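The workflow described above, drawing uncertain inputs from distributions, running the model many times, and postprocessing the outputs, can be sketched without monaco itself. The snippet below is a plain-Python illustration of that Monte Carlo pattern with a made-up projectile model; it is not monaco's actual API:

```python
import math
import random
import statistics

# Hypothetical model: projectile range from uncertain launch speed and
# angle. The model and all numbers are illustrative only.
def model(speed, angle_deg):
    theta = math.radians(angle_deg)
    return speed ** 2 * math.sin(2 * theta) / 9.81

random.seed(0)

# 1. Define input variables as draws from distributions.
# 2. Run the model many times.
results = []
for _ in range(10_000):
    speed = random.gauss(30.0, 2.0)      # m/s, uncertain
    angle = random.uniform(40.0, 50.0)   # degrees, uncertain
    results.append(model(speed, angle))

# 3. Postprocess the outputs into summary statistics.
mean = statistics.fmean(results)
low, high = sorted(results)[500], sorted(results)[9500]  # ~90% interval
```

Libraries like monaco automate exactly this loop, plus parallel execution and richer statistics.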
The Myth of the Normal Curve and What to Do About It - Allan Campopiano | SciPy 2022
1.4K views · a year ago
📊 Interactive notebook app: deepnote.com/workspace/allan-campopiano-4ca00e1d-f4d4-44a2-bcfe-b2a17a031bc6/project/robust-stats-simulator-7c7b8650-9f18-4df2-80be-e84ce201a2ff//notebook.ipynb 🐍 Hypothesize (robust statistics library for Python): github.com/Alcampopiano/hypothesize ✍️ Proceedings manuscript: conference.scipy.org/proceedings/scipy2022/pdfs/allan_campopiano.pdf
Keynote: Fairness of Machine Learning in Medical Image Analysis - Enzo Ferrante | SciPy 2022
612 views · a year ago
Medical institutions around the world are adopting machine learning (ML) systems to assist in analyzing health data; at the same time, the research community of fairness in ML has shown that these systems can be biased, resulting in disparate performance for specific subpopulations. In this talk, we will discuss the relationship between bias, ML and health systems, addressing the specific case ...
Improving Random Sampling in Python - Pamphile Roy | SciPy 2022
955 views · a year ago
NumPy random number generators and SciPy distributions are widely used to get random numbers. However, challenges might arise in the following situations: (i) sampling from non-standard distributions can be slow if a custom implementation is not available and (ii) sampling in high dimensions leads to poor convergence rates. Thanks to new developments in SciPy, there is an answer to these proble...
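The SciPy additions the talk refers to live in `scipy.stats.sampling`. As a minimal, standard-library-only illustration of the underlying idea, here is inverse transform sampling for a distribution whose inverse CDF has a closed form (our own sketch, not the talk's code):

```python
import math
import random

# Inverse transform sampling: if U ~ Uniform(0, 1) and F is a CDF,
# then F^{-1}(U) follows the distribution with CDF F.
# For Exponential(rate): F(x) = 1 - exp(-rate * x),
# so F^{-1}(u) = -log(1 - u) / rate.
def sample_exponential(rate, n, rng):
    return [-math.log(1.0 - rng.random()) / rate for _ in range(n)]

rng = random.Random(42)
samples = sample_exponential(rate=2.0, n=50_000, rng=rng)

# The mean of Exponential(rate) is 1/rate = 0.5.
mean = sum(samples) / len(samples)
```

SciPy's newer generators apply the same principle to distributions with no closed-form inverse, by numerically inverting the CDF.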
Maintaining Fortran in Python in Perpetuity | SciPy 2022
666 views · a year ago
very useful tool. it's such a freaking pain to merge jupyter notebooks D:
*Summary*

*Overall Tutorial:*
* *Focus (**0:03**):* Practical application and problem-solving for numerical optimization using Python libraries.
* *Libraries:* `scipy.optimize`, `estimagic`, and `jaxopt`.
* *Exercises (**0:36**):* Hands-on Jupyter notebooks with examples of common optimization issues and their solutions.
* *Prerequisites:* Basic Python, NumPy, and function definition knowledge.

*Library Breakdown:*
* *`scipy.optimize` (**0:03**):*
  * Simple, mature, and reliable starting point.
  * Provides access to 14 local optimizers suitable for various optimization problems.
  * Parameters are 1D NumPy arrays.
  * Lacks features like parallelization, interactive feedback, and flexible parameter representation.
* *`estimagic` (**0:31**):*
  * Built on top of `scipy.optimize` and other libraries (`nlopt`, `tao`, `pygmo`, etc.), providing a harmonized interface.
  * Offers a wider range of optimizers and advanced features, including:
    * Flexible parameter representation using dictionaries, Pandas Series/DataFrames, and nested structures.
    * Interactive dashboard, logging, and visualization tools.
    * Built-in scaling and constraint handling mechanisms.
    * Support for global optimization techniques.
  * Emphasizes informed algorithm choice and robust convergence assessment.
* *`jaxopt` (**0:47**):*
  * Utilizes JAX for automatic differentiation, JIT compilation, and GPU acceleration.
  * Provides differentiable optimizers, enabling gradient-based approaches with high precision and speed.
  * Excels in solving many instances of similar optimization problems efficiently through vectorization.

*Key Concepts and Exercise Highlights:*
* *Criterion Functions (**3:28**):* Defining optimization targets as Python functions.
* *Start Parameters (**5:26**):* Understanding their importance and setting them appropriately.
* *Algorithm Selection (**30:11**):* Choosing the right algorithm based on function properties (differentiability, complexity, constraints, size). Exercises involve identifying and fixing optimization failures by switching algorithms.
* *Scaling (**40:22**):* Recognizing the impact of poorly scaled problems and using `estimagic`'s scaling capabilities to improve performance. Visualizing scaling issues with slice plots.
* *Benchmarking (**53:32**):* Comparing optimizer performance on a set of benchmark problems with known optima. Utilizing profile plots and convergence plots for analysis.
* *Bounds and Constraints (**1:08:48**):* Using bounds, fixed parameters, and linear constraints to define the optimization problem. Exercises involve implementing these constraints in `estimagic`.
* *Automatic Differentiation (**49:59**):* Employing JAX to calculate gradients efficiently and accurately. Implementing JAX gradients within `estimagic`.
* *Global Optimization (**1:29:49**):* Briefly introducing techniques like genetic algorithms, Bayesian optimization, and multi-start optimization.
* *Vectorization with JAX (**2:00:19**):* Utilizing `jaxopt` and the `vmap` function transformation to solve multiple optimization problems concurrently.

*Exercise Breakdown with Timestamps:*

*Exercise 1: First Optimization with `scipy.optimize` (**6:57**):*
* *Goal:* Familiarize yourself with basic optimization in Python using `scipy.optimize.minimize`.
* *Task:*
  * Translate a mathematical criterion function (a function of multiple variables to be minimized) into Python code.
  * Set up starting parameters for the optimization.
  * Use `scipy.optimize.minimize` to find the minimum of the function.
* *Key Takeaway:* You learn the essential steps involved in setting up and solving a basic optimization problem in Python.

*Exercise 2: Convert Previous Example to `estimagic` (**13:45**):*
* *Goal:* Experience the advantages of `estimagic`'s interface and features.
* *Task:*
  * Convert the criterion function and starting parameters from Exercise 1 to work with `estimagic.minimize`.
  * Use dictionaries instead of flat arrays to represent parameters, taking advantage of `estimagic`'s flexibility.
  * Plot the optimization history using `estimagic`'s built-in plotting functions (`criterion_plot` and `params_plot`) to visualize the optimization process.
* *Key Takeaway:* You become comfortable with `estimagic`'s syntax, understand how to represent parameters flexibly, and learn to use visualization tools for analyzing optimization runs.

*Exercise 3: Play with Algorithm and `algo_options` (**30:26**):*
* *Goal:* Develop an intuition for choosing appropriate algorithms and understanding their impact on optimization success.
* *Task:*
  * You receive code snippets for two optimization problems, each with a pre-selected algorithm that *appears* to succeed but produces incorrect results.
  * Analyze the criterion functions to understand why the initial algorithm choice fails.
  * Choose a different algorithm (and potentially fine-tune `algo_options`) that successfully finds the true minimum.
* *Key Takeaway:* You gain a deeper understanding of the strengths and weaknesses of different optimization algorithms and learn how to diagnose and address optimization failures caused by inappropriate algorithm choices.

*Exercise 4: Benchmarking Optimizers (**54:53**):*
* *Goal:* Learn to systematically compare optimizers and understand their relative performance on different types of problems.
* *Task:*
  * Use `estimagic`'s benchmarking tools to run a set of benchmark problems with various optimizers.
  * Visualize the results using profile plots (showing the share of problems solved over the number of function evaluations) and convergence plots (detailing the convergence paths for individual problems).
  * Compare different implementations of the Nelder-Mead algorithm to see how implementation details can affect performance.
* *Key Takeaway:* You gain experience with benchmarking optimizers, understand how to interpret benchmark results, and learn to appreciate the importance of choosing well-implemented algorithms.

*Exercise 5: Constrained Optimization (**1:24:31**):*
* *Goal:* Implement bounds and constraints to define a more realistic optimization problem.
* *Task:*
  * Use `estimagic`'s constraint handling features to:
    * Set upper and lower bounds on specific parameters.
    * Fix certain parameters at their starting values.
    * Implement a linear constraint on the average of a subset of parameters.
  * Solve the constrained optimization problem and compare the results to the unconstrained case.
* *Key Takeaway:* You learn to define and solve constrained optimization problems in `estimagic` and understand the impact of constraints on the solution.

*Exercise 6: Scaling of Optimization Problems (timestamp not available):*
* *Goal:* Visualize and address the challenges posed by poorly scaled optimization problems.
* *Task:*
  * Work with a badly scaled benchmark problem.
  * Use `estimagic`'s `slice_plot` function to visualize the sensitivity of the criterion function to changes in each parameter.
  * Run the optimization with and without scaling (`scaling=True` in `estimagic.minimize`) and compare the results using a criterion plot.
* *Key Takeaway:* You understand the concept of scaling in optimization, learn to recognize scaling issues through visualization, and experience how `estimagic`'s scaling feature can significantly improve optimizer performance.

*Exercise 7: Using JAX Derivatives in `estimagic` (**1:53:58**):*
* *Goal:* Integrate JAX's automatic differentiation capabilities into `estimagic` for faster and more precise gradients.
* *Task:*
  * Translate a criterion function to use JAX arrays (`jnp`).
  * Compute the gradient of the function using `jax.grad` and optionally JIT-compile it for further speedup.
  * Solve the optimization problem using `estimagic.minimize`, passing the JAX gradient as the `derivative` argument.
* *Key Takeaway:* You learn to combine the strengths of `estimagic` and JAX, demonstrating how automatic differentiation can be seamlessly integrated to enhance optimization performance.

*Exercise 8: Vectorized Optimization in `jaxopt` (**2:00:19**) (Optional):*
* *Goal:* Explore `jaxopt`'s capabilities for solving multiple instances of the same optimization problem concurrently.
* *Task:*
  * Define a wrapper function that encapsulates the `jaxopt` optimization process for a single problem instance.
  * Use JAX's `vmap` function transformation to vectorize the wrapper function, enabling it to handle batches of problems.
  * Solve a set of problems with varying parameters efficiently using the vectorized solver.
* *Key Takeaway:* You gain exposure to `jaxopt` and understand how to leverage JAX's vectorization features for situations where you need to solve many similar optimization problems.

These exercises offer a comprehensive, hands-on approach to learning practical numerical optimization in Python, covering a wide range of topics from basic problem setup to advanced techniques using JAX and `jaxopt`. They are designed to build your intuition, problem-solving skills, and confidence in tackling real-world optimization challenges.

I used Gemini 1.5 Pro to summarize the transcript.
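Exercise 1's basic pattern, write a criterion function, pick start parameters, and call `scipy.optimize.minimize`, looks roughly like this (the shifted quadratic below is a stand-in with a known minimum, not the tutorial's actual exercise function):

```python
import numpy as np
from scipy.optimize import minimize

# Criterion function: a function of several variables to be minimized.
# This stand-in quadratic has its minimum at (1, 2, 3) by construction.
def criterion(x):
    return np.sum((x - np.array([1.0, 2.0, 3.0])) ** 2)

# Start parameters as a flat 1D NumPy array, as scipy.optimize expects.
x0 = np.zeros(3)

res = minimize(criterion, x0, method="L-BFGS-B")
# res.x ends up close to [1, 2, 3], res.fun close to 0.
```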
Awesome presentation! Do you provide the solutions to the exercises?
the mere fact that everyone and his brother is working on a way to speed up python makes it *crystal* clear that python is unusable except for throw-away prototypes or maybe a bignums calculator.
How can an external LaTeX file be called/included/embedded in a JupyterLab cell?
Such a nice presentation on Plotly. In one area, while following the visuals against the code, I get no response from fig.show('json') at 12:53 (the root, data, etc. output). Please help me if anything else is required.
It was very helpful. Thank you.
Hi. I have a normal Python dictionary whose values are lists of numbers. Is there a way to convert this dictionary into a Numba typed dictionary?
please remove this NOISE; no way to hear such poor quality.
I tested Cython with Python 3.11. I just created a GUI library with Cython, and it seems like a nice language and a nice bridge between C and Python. But surprisingly, "ctypes" ran faster than Cython. Yes! I wrote the same GUI lib in Odin & C3, then called the functions from Python with ctypes. It was 2.5 times faster than Cython. Both Odin & C3 are newer languages with manual memory management, and both aim to be alternatives to C. Due to this performance difference, I checked my Cython code again and again. I realized that type conversion takes more time in Cython. But the ctypes module in CPython 3.11 is marvelous.
Thanks, this was fast. A little difficult for newcomers, but it was great.
How do I install Mayavi? It always shows "file not responding"!
Can you implement your own manifold with geomstats, then customize your metric and connection, or is that not possible?
why isn't there some kind of goddamn ISO standard already
how are the dependencies managed ?
I started out understanding nothing and finished understanding nothing, and then some.
Very good❤❤❤🎉🎉🎉🎉
Easy. Uninstall Matlab, and use Python.
Can you guys provide the materials for this tutorial? The link in the description is not working.
Didn't expect to understand the (I know, very very) basics of gene expression from a NumPy tutorial. Thank you.
Ahhh, such a polite teacher, and the way she talks and explains. OMG, she and people like her are really a gift to our society. Stay safe, keep teaching, and keep smiling. Thank you.
Useful.
Is nobody gonna talk about how swiftly he switched from Mac to Windows
22:06 pybind11
40 degrees C, where are you?
"the posterior is what we're interested in" - brother knows what's up!
thanks for the presentation
Government using python notebooks
Potentially such an interesting talk. Do you have a link without the masks and with better sound?
great talk
How is this library installed? I've really never successfully installed it on any platform I've used.
Awesome!
This is a very impressive demonstration. I wish it was recorded at a higher resolution. Thank you for sharing.
Yup. Using this one tomorrow.
Very good explanation. Unfortunately, I cannot access the study material.
Tfx is useless junk😂
Best project ever! Keep on!
Would someone be able to share the Jupyter Notebooks? The link in the description is not working for me ...
well done talk, thanks! :-)
thank you for this concise explanation!
Coming from R I was like dang
2:17:20
2:01:17
52:27
1:03
Greatly simplified explanation, inspiring to enter AI for segmentation
Should have a million views..but I guess just a small percentage of our population cares about how we became
At minute 6:40 the presenter uses get_feature_name(), and this method is outdated and no longer works with the latest sklearn release; use get_feature_names_out instead and it will work for you, God willing...
Where can I get the pdf that he was showing in the class?
4h!!!
Programming takes much time