[libre-riscv-dev] extremely busy crowdsupply update started
Jock Tanner
tanner.of.kha at gmail.com
Sun Mar 29 11:46:06 BST 2020
Hello Jean-Paul,
Thank you for such a thorough answer. I'd be happy to help without
taking too much of your time. I understand that you have a huge piece
of work on your hands. It's great that other people are getting
involved in Coriolis.
On Sat, 2020-03-28 at 17:44 +0100, Jean-Paul Chaput wrote:
> I'm not familiar with virtualenv and I don't understand how the
> C++ Python modules can be separated from the Python extension
> themselves.
> Breaking things into multiple components introduces complexity that
> you must manage afterwards. I can see a lot of trouble when people
> have mismatched versions of the C++ core and the PyPI extensions.
> And lots of nightmarish debug problems.
> Native packages would be better, and can be done, once it is
> stable enough.
virtualenv is a great asset for Python developers, and it strongly
influences the way Python software is shipped. I'll try to
elaborate.
When building native packages, we usually think of the Python
interpreter itself and the required Python modules as system-wide
dependencies, with all the version restrictions that come with that.
When our dependencies are few and their versions most probably match
those packaged for the OS, it's OK. When the Python-related part of
the project grows, version mismatches become a hindrance.
But Python is better than that! CPython itself, with all its built-in
components, is totally independent of its position in the OS directory
tree. It relies on its own, mostly FHS-compliant, directory tree, which
is defined at runtime by the PYTHONPATH environment variable. So
'virtualenv' and similar tools were created to manage multiple
PYTHONPATH trees at the user level, switch between them, help pip
install packages into them, etc.
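A quick way to see this switch in action (a minimal sketch; the
environment name "myenv" is just an example):

    # check_env.py -- print where the current interpreter looks for
    # packages. Run it with the system Python, then again after
    #   python3 -m venv myenv && . myenv/bin/activate
    # and watch the search path switch to the environment's own tree.
    import sys

    print("prefix:", sys.prefix)        # e.g. /usr vs. .../myenv
    for entry in sys.path:
        print("search path:", entry)    # includes the active env's site-packages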
If a Python package can be installed in a virtual environment, it
becomes totally independent of the system-wide Python packages and
even of the Python interpreter itself (using pyenv). That is very
convenient for developers and users alike, and it is why Python
bindings are usually decoupled from the underlying programs or
libraries they wrap. For example, lxml is installed separately from
libxml2 and libxslt, but depends entirely on their system-wide
installations.
Another implication of PYTHONPATH being FHS-like is that literally
every program can be built and installed into PYTHONPATH rather than
anywhere else. For example, uWSGI is a Python application server that
has CPython statically compiled into itself, and it could therefore be
shipped as a native OS package. But its creators have chosen another
approach: they made it a pip package, and as such it relies on
PYTHONPATH and supports virtualenv, just for the sake of convenience.
Their pip package, in turn, can be included in any OS repository as a
Python package by independent maintainers.
To summarize: if a project consists of a native part (C/C++) and a
dynamic part (Python with C bindings), and the native part can be used
without the dynamic part, then it makes sense to decouple them and
package them separately: the native part as an OS package or a build
script, and the dynamic part in whatever way the corresponding
dynamic-language infrastructure (pip/setuptools) suggests. That
introduces some overhead, but also adds flexibility to the process.
See the sketch below for what the dynamic part could look like.
But if the native and dynamic parts are inseparable from each other,
it is better to just ship the whole thing as a pip package.
AFAIK these approaches are prevalent in Python software development.
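To make the decoupled option concrete, here is a sketch of how the
bindings' packaging could look (purely illustrative: "libfoo" and the
file names are hypothetical stand-ins, not the actual Coriolis layout):

    # setup.py -- hypothetical sketch of shipping Python bindings as
    # their own pip package; "libfoo" stands in for a C/C++ core that
    # is already installed system-wide (headers plus shared library).
    from setuptools import Extension, setup

    bindings = Extension(
        "foo._foo",                     # the compiled extension module
        sources=["src/foo_module.c"],
        libraries=["foo"],              # link against the system libfoo
    )

    setup(
        name="foo-bindings",
        version="0.1",
        packages=["foo"],
        ext_modules=[bindings],
    )

Running 'pip install .' would then build the extension inside whatever
virtualenv is active, while libfoo itself remains an OS package.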
> Not so fast here, we need a definition of "end users". In my lab,
> the people using Coriolis are mostly ASIC people, that is
> electrical engineers. They use the minimal and simplest possible
> subset of any programming language.
> You may look at the relationship between C++ and SystemC. SystemC
> is comparable to nMigen but under the form of a C++ library. They
> don't even tell them it's C++. SystemC is usually presented as a
> language on its own (sometimes they talk about C).
> In fact, and this is a trend I've seen emerge with the rise of
> FPGA for some time, you have *two* kinds of audience.
>
> * Electrical engineers, or guys coming "from the hardware", with
>   limited programming skills. They describe their design as a
>   series of blocks. They want to tightly control how and what
>   they generate.
> * Computer Science types, or guys coming "from the software", with
>   advanced programming skills. They tend to see their design in
>   terms of classes and functions, with a lot of implicit
>   mechanisms. Typical examples are Chisel or SpinalHDL.
>
> At this point I have no clear answer about what the "best" language
> would be. But I can tell you for sure that many "hardware guys" are
> literally repelled by languages like Chisel.
> It's a discussion we are currently having at the lab. Like the one
> that did take place in this list a little while ago.
Both Chisel and SpinalHDL are embedded in Scala, so their syntax is
Scala's. Scala is a younger, somewhat less scary sibling of Haskell,
and Haskell can scare the shit out of any software engineer who lacks
a strong mathematical background. So I suggest it's not best practices
that repel "hardware guys", but a careless introduction to the
functional approach to programming.
> My Python interface allows me to mirror in Python
> the C++ function overloads, but Python only allows one set of
> parameters, is that true or am I lagging on Python features again?
> So how to describe a function with more than one parameter set?
As a function with optional parameters.
What we have now:
class Box(object):
    def __init__(self, *args, **kwargs):  # real signature unknown
        pass
What we could have with only positional arguments:
class Box(object):
    def __init__(self, x1=None, y1=None, x2=None, y2=None):
        """
        Creates a box of a given shape. If all parameters are
        omitted, creates an empty box.
        :param x1: starting horizontal offset,
        :param y1: starting vertical offset,
        :param x2: horizontal limit,
        :param y2: vertical limit,
        :return: Box object.
        """
        pass
This is about as neat as it gets in Python 2, but it doesn't account
for Point or Box arguments. From a Python perspective, the fact that
the first positional argument can be either a single coordinate, a
point, or a box amounts to variable reuse, which is bad for
readability. But nevertheless we can account for it in the docstring:
class Box(object):
    def __init__(self, x1=None, y1=None, x2=None, y2=None):
        """
        Creates a box of a given shape. If all parameters are
        omitted, creates an empty box.
        :param x1: starting horizontal offset, or starting point,
            or another box to copy,
        :param y1: starting vertical offset, or limiting point,
        :param x2: horizontal limit,
        :param y2: vertical limit,
        :return: Box object.
        """
        pass
With Python 3.4+ we could get much better results with the typing
module and Union[]. But with Python 2, without mypy, that's it.
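For the record, a sketch of what that Python 3 variant could look
like. Point and Box here are simplified stand-ins, not the real
Hurricane classes, and the runtime dispatch is my own guess at how the
overloads could be resolved:

    from typing import Optional, Union


    class Point:
        def __init__(self, x: int, y: int) -> None:
            self.x, self.y = x, y


    class Box:
        def __init__(self,
                     x1: Optional[Union[int, Point, "Box"]] = None,
                     y1: Optional[Union[int, Point]] = None,
                     x2: Optional[int] = None,
                     y2: Optional[int] = None) -> None:
            """Empty box, copy of another box, box spanning two points,
            or box given four coordinates."""
            if isinstance(x1, Box):                   # Box(other_box)
                self.x1, self.y1 = x1.x1, x1.y1
                self.x2, self.y2 = x1.x2, x1.y2
            elif isinstance(x1, Point) and isinstance(y1, Point):
                self.x1, self.y1 = x1.x, x1.y         # Box(p1, p2)
                self.x2, self.y2 = y1.x, y1.y
            else:                                     # Box(x1, y1, x2, y2)
                self.x1, self.y1, self.x2, self.y2 = x1, y1, x2, y2

mypy or an IDE can then check the allowed argument combinations,
instead of relying on the docstring alone.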
> Quite the opposite, I do think style is important. The goal of
> style is readability and it is mine. But I don't think you can say
> that there is *one* good style and all others are bad. And even the
> reference style may evolve... It is easy to slip from "the style
> I'm accustomed to" to "the right style". I understand the
> advantages of using a style shared by many people, but I won't
> blindly comply (I'm both French and a bit stubborn). I will try to
> find time to read PEP 8, but I have a lot of work to do...
> I've already seen this debate about C++ coding, and there are
> definitely good and bad styles (or a lack of style) but I won't say
> there is "the good style". Among good styles, there are different
> tradeoffs that may or may not suit you.
I can't completely agree with you on this subject, Jean-Paul. While
other languages have style guides that evolved significantly over
their lifetimes, PEP 8 has been literally the one and only Python
style guide for the last 19 years, and it has never changed, only
been extended once to account for new language features. Other style
guides (Google, Khan Academy) only extend PEP 8 and the other PEPs;
they never overrule them.
> Maybe it has changed in the company world, but from what I've seen
> in the ASIC world, code cleanliness still has a long way to go...
> But again, it may be a different kind of company we are talking
> about. I'm talking about Cadence/Mentor/Synopsys; if you have the
> opportunity, take a look.
Can I really see the code you mention, or is it under NDA?
The first thing that came to mind was MyHDL [1]. I think their code is
quite clean and idiomatic.
> A last reflection about PyCharm: it is very good for people working
> only on Python code. But for my part, I edit C++, C, Python, ReST,
> cmake, LaTeX, Yaml and whatnot, so I use Emacs (vim could also do).
> It would be inefficient to have one IDE/editor per language.
Speaking of JetBrains products, PyCharm is indeed not suited for C/C++
development. CLion, on the other hand, has a Python plugin that offers
all the features of PyCharm CE, plus first-class CMake support. I
tried CLion on a small C project, and it went perfectly. But alas,
CLion is a non-free product. [2]
Now, if I were tasked with setting up an environment for large-scale
hybrid C/C++/Python development, I would try Eclipse with the CDT and
PyDev plugins. It may lack a bit in features and usability, but it's
free.
As for other formats like reST, JSON, LaTeX, etc., all of these IDEs
handle them well, either out of the box or via plugins.
As for Emacs, I've read some posts [3] and watched some videos, and I
must say I'm not impressed. It may seem from the descriptions that
each plugin does a decent job, but the system as a whole lacks any
background awareness, or what you might call multitasking ability, and
the screen is very inexpressive. The editor just reacts to keystrokes,
and that's all. It looks like you're supposed to manually start the
syntax check, the style check, the static analysis, then fix the
problems and start again, and again. It really looks tedious, boring,
and error-prone.
By contrast, when I open a file in PyCharm, it immediately starts all
the checks and gradually (it may take a second on a decent machine)
highlights all the real or potential problems in the background.
Critical things are bright red: syntax errors, uninitialized
variables, unresolved imports. Non-critical things are pale yellow or
greyish: call arguments not matching the callee's signature, variables
that may be uninitialized, unused code, style problems, syntax errors
in human-language text.
I can scroll, type, and switch between files, and still see all the
results of this background work at the same time.
Eclipse's PyDev can also do static checks while you type [4], but its
list of possible warnings seems shorter than PyCharm's, and the result
is less aesthetically pleasing, to my taste.
So, Jean-Paul, I would really like you to try either IDE on real code
and tell me what you think.
[1] https://github.com/myhdl/myhdl/
[2]
https://intellij-support.jetbrains.com/hc/en-us/community/posts/115000477130-How-to-handle-a-mixed-Python-C-C-project
[3] https://www.fullstackpython.com/emacs.html
[4] https://www.pydev.org/manual_adv_code_analysis.html