On 25 January 2017 at 13:09, Marcus Shawcroft <marcus.shawcroft@gmail.com> wrote:
Hi!

The sensor.h API provides various SENSOR_CHAN_*_ANY channels. The
intent of these, and as far as I can see all current use of them, is
to represent the tuple of all of a group of related X, Y, Z channels.
Given that each always refers to *all* of the related channels rather
than to any individual one of the set, I wonder whether it would be
less confusing to rename them SENSOR_CHAN_*_ALL or SENSOR_CHAN_*_XYZ ?

Thoughts ?

The initial naming was indeed not the best. I will change it to SENSOR_CHAN_*_XYZ, as its meaning is evident.
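
For illustration, here is a minimal sketch of how an application would
read the whole tuple once the rename lands. This is against the API as
I understand it today; the "bma280" binding name is just a placeholder,
and sensor_value is assumed to carry an integer part in val1 and
millionths in val2:

#include <device.h>
#include <sensor.h>
#include <misc/printk.h>

/* Fetch one sample and read all three accelerometer axes at once.
 * SENSOR_CHAN_ACCEL_XYZ is the proposed replacement name for
 * SENSOR_CHAN_ACCEL_ANY.
 */
void read_accel_tuple(void)
{
        struct device *dev = device_get_binding("bma280");
        struct sensor_value val[3];

        if (dev == NULL || sensor_sample_fetch(dev) < 0) {
                return;
        }

        /* The XYZ channel fills one sensor_value per axis. */
        if (sensor_channel_get(dev, SENSOR_CHAN_ACCEL_XYZ, val) == 0) {
                printk("x=%d.%06d y=%d.%06d z=%d.%06d\n",
                       val[0].val1, val[0].val2,
                       val[1].val1, val[1].val2,
                       val[2].val1, val[2].val2);
        }
}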
 

The sensor attributes provide a mechanism to set an OFFSET on any
specific channel. This appears to be intended as a mechanism to
exploit a hardware capability, provided by some devices, to configure
an arbitrary measurement offset. We have ~2 drivers in the tree that
support this attribute.

Given the current API, an application (or other driver user) who might
want to use the OFFSET mechanism must either:
1) Assume no driver implements OFFSET and post-process results itself.
2) Hardwire knowledge that it is using the driver for a specific
device that does, or does not, support the OFFSET feature.
3) Attempt to use OFFSET, detect ENOTSUP (or whatever other error code
is used for the same reason) and fall back to post-processing results
itself.

Number 1 essentially means the OFFSET mechanism serves no purpose
unless it is ubiquitous.
Number 2 undermines the device abstraction property of a platform OS.
Number 3 means increased application logic and, to a certain extent,
also undermines the device abstraction.
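
To make 3) concrete, the detect-and-fallback dance looks roughly like
this (a sketch, not code from the tree; the software fallback and its
bookkeeping are invented for illustration):

#include <errno.h>
#include <stdbool.h>
#include <device.h>
#include <sensor.h>

/* Fallback state, used only when the driver rejects the attribute. */
static struct sensor_value sw_offset;
static bool use_sw_offset;

/* Try the hardware OFFSET attribute first; on -ENOTSUP, remember the
 * offset and apply it in software in the read path instead.
 */
int set_accel_z_offset(struct device *dev, const struct sensor_value *off)
{
        int rc = sensor_attr_set(dev, SENSOR_CHAN_ACCEL_Z,
                                 SENSOR_ATTR_OFFSET, off);

        if (rc == -ENOTSUP) {
                /* Option 3: post-process in the application. */
                sw_offset = *off;
                use_sw_offset = true;
                rc = 0;
        }
        return rc;
}

int get_accel_z(struct device *dev, struct sensor_value *val)
{
        int rc = sensor_channel_get(dev, SENSOR_CHAN_ACCEL_Z, val);

        if (rc == 0 && use_sw_offset) {
                val->val1 += sw_offset.val1;
                val->val2 += sw_offset.val2; /* fractional carry omitted */
        }
        return rc;
}

Exactly this logic, duplicated in every application, is what mandatory
OFFSET support would remove.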

From an application's / device driver user's perspective, the
usability of the API would be much improved if the OFFSET attribute
feature were either mandatory for a specific channel type across all
drivers, or removed completely. A driver for hardware with no offset
feature can easily emulate the behaviour within the driver itself.
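
A hypothetical driver-side sketch of that emulation (names invented;
the real hooks are the attr_set/channel_get entries in the driver's
sensor_driver_api):

#include <errno.h>
#include <device.h>
#include <sensor.h>

/* Driver data for a part with no hardware offset register. */
struct foo_data {
        struct sensor_value offset;
};

static int foo_attr_set(struct device *dev, enum sensor_channel chan,
                        enum sensor_attribute attr,
                        const struct sensor_value *val)
{
        struct foo_data *data = dev->driver_data;

        ARG_UNUSED(chan);

        switch (attr) {
        case SENSOR_ATTR_OFFSET:
                /* No hardware support: store the offset here and add
                 * it to raw readings in the channel_get path.
                 */
                data->offset = *val;
                return 0;
        default:
                return -ENOTSUP;
        }
}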

I think we should give consideration to mandating driver support for
OFFSET attributes, either for all channels or for specific channels
(i.e. those where we already have driver support for OFFSET).

I think this argument for consistency applies equally to other attribute types.

Thoughts ?


I am not sure that enforcing having (or not having) the OFFSET attribute for specific channels is necessarily the best solution. OFFSET is used as a means of manually calibrating sensors that have hardware support for such a feature. Some sensors may not need calibration, and adding this attribute to the driver may add little or no benefit to the application.

Also, extending this discussion to other attribute types, the answer may not be as straightforward. SENSOR_ATTR_SAMPLING_FREQUENCY or SENSOR_ATTR_FULL_SCALE may not be relevant for some drivers (or may even have a fixed value), in which case enforcing support for the attribute doesn't make sense.

The initial design of the attributes was to offer support for non-generic sensor features, and not to force drivers to implement any specific attribute if there is no need for it. So it was thought that application developers would know beforehand what sensor they will use and what attributes it supports.

So the current attribute model is closer to point 2) which you mentioned, or to an approach similar to 3) if the developer wants to write a more generic application.

This discussion can also be extended to the fact that there is no rule enforcing Kconfig options, as they are not consistent across sensors of the same type.
 

SENSOR_ATTR_CALIB_TARGET is, I believe, intended as a mechanism to
allow an application to provide a driver with a "right answer" at the
current point in time, and to tell it to calibrate itself such that
its results match the calibration target?

The ATTR_OFFSET mechanism and the ATTR_CALIB_TARGET mechanism have
potential to interact on a channel where both are supported, I think
we should clarify in the API how they are intended to interact.
Notably, if an application issues an OFFSET followed by a CALIB_TARGET
for the same channel, we should define how the current offset affects
the behaviour of the CALIB operation.  I can imagine two alternatives:
1) CALIB has the effect of zeroing any current OFFSET.
2) Any current OFFSET is added to the CALIB target before calibration
and the current OFFSET remains in force after the CALIB operation.

I think 1) is intuitively more obvious and I can think of no benefits
in choosing 2).
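
In code, the ambiguous sequence is just two attr_set calls; under 1),
the second call would discard the first (semantics as proposed here,
not as currently specified; the target value is an arbitrary example):

#include <device.h>
#include <sensor.h>

void calibrate_accel_z(struct device *dev)
{
        struct sensor_value off = { .val1 = 1, .val2 = 0 };
        struct sensor_value target = { .val1 = 9, .val2 = 806650 };

        /* Manual calibration first... */
        sensor_attr_set(dev, SENSOR_CHAN_ACCEL_Z,
                        SENSOR_ATTR_OFFSET, &off);

        /* ...then automatic calibration against ~1g. Under 1), this
         * zeroes the OFFSET set above before calibrating.
         */
        sensor_attr_set(dev, SENSOR_CHAN_ACCEL_Z,
                        SENSOR_ATTR_CALIB_TARGET, &target);
}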

Thoughts?

The point you made for 1) is correct. OFFSET and CALIB shouldn't be used together, as they are means of manual and automatic calibration respectively, so using one will invalidate the configuration set by the other.
 

Cheers
/Marcus

Bogdan