Conan 1.21: Faster parallel uploads, Intel compiler support, improved Python requires and matching names in generators for the upstream package!
We are saying goodbye to 2019 with the 1.21 release, which comes with lots of cool features, bug fixes, and contributions from the community. Let's see some of them.
Use multiple threads for uploading Conan packages to remotes
After many users requested this feature, we have been preparing it since Conan 1.19. First, we moved to a progress bar system that supports concurrency. After that, we did a whole refactor of the console output. Now, since Conan 1.21, it is possible to upload Conan packages to remotes faster thanks to the use of multiple threads. To activate this feature, you just have to add the --parallel argument to the conan upload command:
$ conan upload "*" --confirm --parallel --all -r my_remote
There are two levels of parallelization. First, all of the references are uploaded in parallel. Then, for each of those references, all of its binary packages are uploaded in parallel too. The total number of simultaneous upload threads is capped at 8. Depending on the configuration and the number of uploaded packages, a speed increase of around 400% can be achieved.
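The scheme above can be sketched in plain Python. This is an illustrative simplification, not Conan's actual implementation; the reference names and package ids are made up:

```python
# Illustrative sketch of parallel uploads: every (reference, package id)
# pair becomes an upload task, and a shared pool caps the total number
# of simultaneous threads at 8, as described above.
from concurrent.futures import ThreadPoolExecutor

MAX_UPLOAD_THREADS = 8

def upload_one(ref, package_id):
    # Placeholder for the real network upload of a single binary package.
    return (ref, package_id)

def upload_all(refs_to_packages):
    """refs_to_packages maps a reference to the list of its binary package ids."""
    with ThreadPoolExecutor(max_workers=MAX_UPLOAD_THREADS) as pool:
        futures = [pool.submit(upload_one, ref, pid)
                   for ref, pids in refs_to_packages.items()
                   for pid in pids]
        return [f.result() for f in futures]  # propagate any upload error

uploaded = upload_all({"zlib/1.2.11": ["id_a", "id_b"],
                       "bzip2/1.0.8": ["id_c"]})
print(len(uploaded))  # 3
```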
Intel compiler support
Now Conan supports the Intel compiler. This compiler has the peculiarity of using Visual Studio on Windows and gcc on Linux as base compilers. This is how the new entry looks in settings.yml:
intel:
    version: ["11", "12", "13", "14", "15", "16", "17", "18", "19"]
    base:
        gcc:
            <<: *gcc
            threads: [None]
            exception: [None]
        Visual Studio:
            <<: *visual_studio
To manage compatibility between the packages generated with intel and those generated with the base compiler, you can use the compatible_packages feature, which provides the base_compatible() and parent_compatible(compiler="compiler", version="version") functions to define compatibility between packages. For example, defining a package_id this way:
def package_id(self):
    if self.settings.compiler == "intel":
        p = self.info.clone()
        p.base_compatible()
        self.compatible_packages.append(p)
This would make Conan resolve the Visual Studio or gcc package in case there is no package generated by the intel compiler. The opposite is also possible, using the parent_compatible function to fall back to Intel packages in case the Visual Studio one is not present:
def package_id(self):
    if self.settings.compiler == "Visual Studio":
        compatible_pkg = self.info.clone()
        compatible_pkg.parent_compatible(compiler="intel", version=16)
        self.compatible_packages.append(compatible_pkg)
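Conceptually, the resolution works as a fallback list: Conan tries the exact package id first and then each declared compatible id in order. A minimal sketch of that idea, with hypothetical package ids (this is not Conan's real code):

```python
# Sketch of compatible-package fallback: try the exact package id
# first, then each compatible package id in declaration order.
def resolve_binary(available_ids, exact_id, compatible_ids):
    for pid in [exact_id] + compatible_ids:
        if pid in available_ids:
            return pid
    return None  # nothing available: Conan would report a missing binary

# Only a Visual Studio binary exists; the intel id falls back to it.
available = {"id_visual_studio_16"}
print(resolve_binary(available, "id_intel_19", ["id_visual_studio_16"]))
```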
Improved Python requires
In the 1.7 release, we introduced the Python requires feature to share Python code between different recipes.
For this release, we have a new, improved python_requires that solves some drawbacks of the old one. These are the main features of the new implementation:
- Class attribute: the syntax declares a class attribute instead of a module function call, so recipes are cleaner.
- Package ID modification: python_requires now affects the consumers' package_id.
As we said, the syntax is now easier and more aligned with the rest of the recipe syntax. With the previous version, if you wanted to reuse methods from a base class, you had to do something like this:
from conans import ConanFile, python_requires

base = python_requires("pyreq/version")

class ConsumerConan(base.get_conanfile()):
    name = "consumer"
    version = base.get_version()
And with the new syntax it looks like this:
from conans import ConanFile

class Pkg(ConanFile):
    python_requires = "pyreq/version"
    python_requires_extend = "pyreq.MyBase"
The version of the python_requires will now affect the package ID of the packages that use it, following a minor_mode policy. That means that changing the minor or major component of the version will generate a new package ID, but the patch component will not affect it. Learn more about the new implementation in the Conan documentation.
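The effect of minor_mode can be pictured with a tiny sketch. The real logic lives inside Conan's package-id computation; this only mimics the version masking:

```python
# minor_mode keeps the major and minor components and masks the patch,
# so only major/minor changes alter the package-ID contribution.
def minor_mode(version):
    major, minor, _patch = version.split(".")
    return "%s.%s.Z" % (major, minor)

print(minor_mode("1.2.0") == minor_mode("1.2.7"))  # True: patch changes are ignored
print(minor_mode("1.2.0") == minor_mode("1.3.0"))  # False: a minor bump means a new package ID
```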
Use different names by generator
As you probably know, the cpp_info attribute of the conanfile stores all the information needed by consumers of a package, like include directories or library names and paths. In 1.19 we introduced a new attribute for this object called name: if you set cpp_info.name, that name is used by some supported generators to create file or variable names instead of the regular package name. Now, in the 1.21 release, we extend this feature with cpp_info.names["generator_name"], so you can specify this name per generator.
If you want to use the cmake and pkg_config generators for the same recipe, different names can be set for each of them. Let's see an example where you have a conanfile.py with this package_info for the package Mylib/0.1:
def package_info(self):
    self.cpp_info.names["cmake"] = "MyLib"
    self.cpp_info.names["pkg_config"] = "my_lib"
If this package is installed using the cmake generator, a target named CONAN_PKG::MyLib will be created. If you install the package using the pkg_config generator, a my_lib.pc file will be generated with the library name my_lib inside.
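The lookup a generator performs can be sketched like this, assuming the documented fallback order (per-generator name, then cpp_info.name, then the package name); the function name is illustrative, not Conan's internal API:

```python
# Fallback order for the name a generator uses, as described above:
# names["generator"] -> cpp_info.name -> package name.
def name_for_generator(names, name, package_name, generator):
    if generator in names:
        return names[generator]
    return name if name is not None else package_name

names = {"cmake": "MyLib", "pkg_config": "my_lib"}
print(name_for_generator(names, None, "Mylib", "cmake"))       # MyLib
print(name_for_generator(names, None, "Mylib", "pkg_config"))  # my_lib
print(name_for_generator({}, None, "Mylib", "cmake"))          # Mylib
```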
Other cool things
- Set the logging level for Conan using a name instead of a number, which is much more intuitive. The available logging levels are: critical, error, warning, info and debug.
- Use tools.check_min_cppstd() and tools.valid_min_cppstd() to check if the cppstd version is valid for a specific package.
- Use the fuzz parameter in the tools.patch() function to accept fuzzy patches.
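As a rough idea of what a minimum-cppstd check does, here is a pure-Python sketch; the real helpers in conans.tools also handle GNU extensions and compiler defaults:

```python
# Compare C++ standards by their chronological order rather than
# numerically, so that 98 correctly sorts before 11.
CPPSTD_ORDER = ["98", "11", "14", "17", "20"]

def meets_min_cppstd(current, required):
    return CPPSTD_ORDER.index(str(current)) >= CPPSTD_ORDER.index(str(required))

print(meets_min_cppstd(17, 14))    # True
print(meets_min_cppstd("98", 11))  # False: 98 predates 11
```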
Have a look at the full list of features and fixes in the changelog.
Report any bug or share your feedback by opening a new issue in our issue tracker. Don't forget to update, and merry Christmas!