What is the relation between BLAS, LAPACK and ATLAS


Solution 1

BLAS is a collection of low-level matrix and vector arithmetic operations (“multiply a vector by a scalar”, “multiply two matrices and add the result to a third matrix”, etc.).

LAPACK is a collection of higher-level linear algebra operations. Things like matrix factorizations (LU, Cholesky (LLᵀ), QR, SVD, Schur, etc.) that are used to do things like “find the eigenvalues of a matrix”, or “find the singular values of a matrix”, or “solve a linear system”. LAPACK is built on top of the BLAS; many users of LAPACK only use the LAPACK interfaces and never need to be aware of the BLAS at all. LAPACK is generally compiled separately from the BLAS, and can use whatever highly-optimized BLAS implementation you have available.

ATLAS is a portable, reasonably good implementation of the BLAS interfaces that also implements a few of the most commonly used LAPACK operations.

What “you should use” depends somewhat on details of what you’re trying to do and what platform you’re using. You won't go too far wrong with “use ATLAS + LAPACK”, however.
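In practice, “use ATLAS + LAPACK” mostly comes down to the link line. A sketch for Linux (library names and paths vary by distribution; `myprog.c` stands in for your own source file):

```shell
# Link against LAPACK with ATLAS supplying the optimized BLAS underneath:
gcc myprog.c -o myprog -llapack -lcblas -latlas -lm

# Alternatively, OpenBLAS bundles BLAS and LAPACK in one library:
gcc myprog.c -o myprog -lopenblas -lm
```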

Solution 2

A while ago, when I started doing some linear algebra in C, I was surprised to find so few tutorials for BLAS, LAPACK and other fundamental APIs, despite the fact that they are the cornerstones of many other libraries. For that reason I started collecting all the examples/tutorials I could find across the internet for BLAS, CBLAS, LAPACK, CLAPACK, LAPACKE, ATLAS, OpenBLAS ... in this GitHub repo.

Well, I should warn you that as a mechanical engineer I have little experience managing such a git repository or GitHub, so it will at first look like a complete mess. However, if you manage to get past the messy structure, you will find all kinds of examples and instructions that might be of help. I have tried most of them to be sure they compile, and I have noted the ones that do not. I have modified many of them to compile with the GNU compilers (gcc, g++ and gfortran). I have written Makefiles which you can read to learn how to call individual Fortran/FORTRAN routines from a C or C++ program. I have also added installation instructions for Mac and Linux (sorry, Windows folks!), and some bash .sh scripts for automatic compilation of some of these libraries.

But on to your other question: BLAS and LAPACK are APIs rather than specific SDKs. They are specifications of interfaces rather than implementations or libraries. That said, there are the original reference implementations by Netlib in FORTRAN 77, which most people (confusingly!) mean when talking about BLAS and LAPACK. So if you see a lot of strange things when using these APIs, it is because you are actually calling FORTRAN routines from C rather than C libraries and functions. ATLAS and OpenBLAS are among the best implementations of BLAS and LAPACK as far as I know. They conform to the original API, although to my knowledge they are implemented in C/C++ from scratch (not sure!). There are GPGPU implementations of the APIs using OpenCL: CLBlast, clBLAS, clMAGMA, ArrayFire and ViennaCL, to mention some. There are also vendor-specific implementations optimized for particular hardware or platforms, which I strongly discourage anyone from using.

My recommendation to anyone who wants to learn to use BLAS and LAPACK from C is to learn mixed Fortran-C programming first. The first chapter of the repo mentioned above is dedicated to this matter, and I have collected many different examples there.

P.S. I have been working on the dev branch of the repository from time to time. It seems slightly less messy!

Solution 3

ATLAS is by now quite outdated. It was developed at a time when it was thought that optimizing the BLAS for various platforms was beyond the ability of humans, and as a result autogeneration and autotuning were the way to go.

In the early 2000s, along came Kazushige Goto, who showed that highly efficient implementations can be coded by hand. You may enjoy an interesting article in the New York Times: https://www.nytimes.com/2005/11/28/technology/writing-the-fastest-code-by-hand-for-fun-a-human-computer-keeps.html.

Kazushige, on the one hand, had better insight into the theory behind high-performance implementations of matrix-matrix multiplication and, on the other hand, engineered these better. His approach, which on current CPUs usually yields the highest performance, is not in the search space that ATLAS autotunes over. Hence, ATLAS is inherently inferior. Kazushige's implementation of the BLAS became known as the GotoBLAS. It was forked as OpenBLAS when he joined industry.

The ideas behind the GotoBLAS were refactored into a new implementation, the BLAS-like Library Instantiation Software (BLIS) framework (https://github.com/flame/blis), which implements the same algorithms, but structures the code so that less needs to be custom implemented for a new architecture. BLIS is coded in C.

What this discussion shows is that there are many implementations of the BLAS. The BLAS themselves are a de facto standard for the interface. ATLAS was once the state of the art. It no longer is.



Author: makhlaghi
Updated on January 25, 2020

Comments

  • makhlaghi
    makhlaghi over 4 years

    I don't understand how BLAS, LAPACK and ATLAS are related and how I should use them together! I have been looking through all of their manuals and I have a general idea of BLAS and LAPACK and how to use them from the very few examples I can find, but I can't find any actual examples using ATLAS to see how it relates to these two.

    I am trying to do some low-level work on matrices, and my primary language is C. First I wanted to use GSL, but it says that if you want the best performance you should use BLAS and ATLAS. Is there any good webpage giving some nice examples of how to use these (in C) all together? In other words, I am looking for a tutorial on using these three (or any subset of them!). In short, I am confused!

  • makhlaghi
    makhlaghi almost 11 years
    Thanks for the explanations. Do you know of any examples of HOW to use ATLAS+LAPACK? I need to see a few examples to understand how to use these! I understand what they are for and the theory of what they do, but I can hardly find any examples in C showing how to use them in practice.
  • Stephen Canon
    Stephen Canon almost 11 years
    @astroboy: can you give me some information about what you’re actually trying to do? LAPACK especially is an enormous library.
  • makhlaghi
    makhlaghi almost 11 years
    For simplicity, lets say I have a matrix and I want to multiply it by a certain value. How can I do this combining ATLAS and (LAPACK or BLAS) in C? I just want to see how to implement any of these functions. There are a few examples in netlib.org/lapack/lapacke.html but there no mention of ATLAS!
  • Noah_S
    Noah_S almost 7 years
    The Wikipedia page for LAPACK starts with "LAPACK (Linear Algebra Package) is a standard software library". Are you saying that's incorrect, since it's an API specification and not an implementation?
  • Noah_S
    Noah_S almost 7 years
    After some research it seems LAPACK and BLAS actually are implementations. From NetLib's faq on BLAS: "The BLAS (Basic Linear Algebra Subprograms) are routines that provide standard building blocks for performing basic vector and matrix operations". From LAPACK's GitHub page: "LAPACK is a library of Fortran subroutines". Based on that, and on reading through LAPACK's GitHub project, my impression is that BLAS and LAPACK actually are implementations--LAPACK builds on BLAS to provide more sophisticated functionality.
  • Foad S. Farimani
    Foad S. Farimani almost 7 years
    @Noah_S I wouldn't use Wikipedia as "the" reference, but to my limited knowledge, there are several implementations of LAPACK. I think calling it an API is more accurate now, but please correct me if I'm wrong.
  • Andrew Janke
    Andrew Janke over 5 years
    I think part of the confusion is that BLAS is an API/specification, but there is also a "Reference Implementation" of BLAS (from Netlib) that is also referred to as just the "BLAS library". Usually when people say BLAS they mean the API, because the Reference Implementation is non-optimized, so it's not used much in practice/industry. ATLAS provides an optimized implementation of a few of the LAPACK subroutines, and then optionally pulls in the rest of them from LAPACK itself to produce a complete LAPACK implementation in the built ATLAS lib files.
  • Minh Nghĩa
    Minh Nghĩa over 4 years
    Is BLAS an interface? That is, must all BLAS implementations adhere to one BLAS specification so that they can be used interchangeably? From what I have read, there is no official BLAS interface, just implementations mimicking each other.
  • Stephen Canon
    Stephen Canon over 4 years
    @MinhNghĩa: There is a standard, netlib.org/blas/blast-forum, but (a) there's no enforcement--a standard without a test suite isn't really a standard--and (b) I don't think that anyone implements the full set of interfaces described by that document. It's, uh, aspirational.
  • Andrey
    Andrey about 4 years
    @Noah_S There is no contradiction, each library has an API, which makes it possible to reimplement the actual functionality while staying API compatible and that is what happened with LAPACK.
  • thistleknot
    thistleknot over 3 years
    Say I have libnvblas.so linked in R via LD_PRELOAD. Do I need to be concerned about using the standard LAPACK with it? For example, I've read that CULA (which has been superseded by MAGMA) has a LAPACK that works with NVIDIA. EPYC has libflame, although I can't seem to get libflame to work with R. But do I need to be concerned? It seems (from what you're saying) that LAPACK will use whatever BLAS library I have under the hood. So why even bother with a different LAPACK?