In the attachment you can find such an example file. Thanks a lot for your support!
If the files come from PicoView, please provide an example (directly here, using Add attachments, if the data are not secret) and I will add support for it to the MI file import module.
Thanks for the reply. Yes, the AFM files are generated by PicoView. I can save the data there as a .mi file, which I have no problems reading into Gwyddion. In addition, the files can also be saved in the described .txt format, but then they are not readable by Gwyddion. I want to use a Python script for detailed analysis of the measurements, but I can't seem to find a way to read .mi files with Python 3 as easily as .txt files. Therefore I'm looking for a file structure of .txt files which Gwyddion can...
Are you trying to write the files in some existing file format, or is this an ad hoc format? The fields xPixels and yPixels look like they come from a Molecular Imaging/PicoView text format, and the data look a bit like it too. I have never seen xUnit, yUnit and Channels in MI files, though. Gwyddion has a bunch of file import modules for various text exports (SPIP, Attocube, Accurex II, SDF/BCR, PLT, WSF), but they are generally single-channel. It seems we might read multiple channels from MI files. The data are, however,...
Hello, I have a really general problem with importing .txt files. I want to import .txt files into Gwyddion from some AFM measurements, but I cannot seem to find a proper example of how the file should be structured so that I can read it in. To be more detailed, the files from the AFM measurements look something like this:

(Header with some basic info like:)
xPixels 512
yPixels 512
xUnit µm
yUnit µm
Channels 4
....

(Measurement data of multiple channels:)
TopographyTrace(µm) TopographyRetrace(µm) Amplitude(V) Phase(deg)...
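Since reading these files from Python 3 comes up elsewhere in this thread, here is a hedged sketch of parsing the layout shown above. Everything beyond the header keys and the column-title row quoted above is an assumption, including that the header ends at the column-title row:

```python
import numpy as np

def read_afm_txt(path):
    """Parse a header of 'key value' lines, then the channel columns."""
    header = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            if line.startswith("TopographyTrace"):
                # Assumption: the column-title row ends the header.
                names = line.split()
                break
            parts = line.split(None, 1)
            if len(parts) == 2:
                header[parts[0]] = parts[1].strip()
        data = np.loadtxt(fh)   # one row per pixel, one column per channel
    xres, yres = int(header["xPixels"]), int(header["yPixels"])
    return header, {name: col.reshape(yres, xres)
                    for name, col in zip(names, data.T)}
```

This only gets the numbers into Python; as noted in the replies, Gwyddion's existing text importers are generally single-channel, so whether Gwyddion itself can read such a text variant is a separate question.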
- added magic comments
- added a couple more tiny grain special cases to the Laplace solver
- added a function special-casing tiny grains, currently 1×1 (see the sketch after this list)
- updated 2.66 news a bit
- try to read binary wave records from PXP files and merge them into a file
- refactored igor_read_single() to read into a caller-supplied container from a caller-supplied
- removed debugging timer
- worked around slow Laplace solver when running on a huge number of small-ish grains
- started adding support for Igor packed experiment (PXP) files
- a feeble attempt to add some conversion to physical values
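For context on the tiny-grain special case above (not the actual Gwyddion code): for a single missing interior pixel, the discrete Laplace equation has a closed-form solution, namely the mean of the four neighbours, so no iterative solver is needed. A minimal numpy illustration:

```python
import numpy as np

def fill_1x1_grain(z, row, col):
    """Closed-form Laplace solution for one missing interior pixel: the
    discrete Laplace equation makes it the mean of its four neighbours,
    so a 1×1 grain never needs the iterative solver."""
    z[row, col] = 0.25 * (z[row - 1, col] + z[row + 1, col]
                          + z[row, col - 1] + z[row, col + 1])

z = np.arange(25, dtype=float).reshape(5, 5)
fill_1x1_grain(z, 2, 2)   # replaces the centre pixel in place
```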
Dear all, recently, when I open an .spm file in Gwyddion, I find that the unit of the Amplitude Error channel is wrong (see the attached snapshot). I am sure that during the measurement the unit was mV, not V. Can someone help me fix this? I would really appreciate it. Thank you a lot in advance.
- read both channels, if present
- moved header reading out of the main load function
- added a NO_REBASE filetype flag for files which look like volume/curve data, but are not
- improved file format detection, especially for version 0
- started adding support also for version 2
- read v0 frames from the correct positions in the file
- started adding support for version 0 of the file format
- do not include pyroscopy version attributes in metadata because they are fake anyway
If the points are in a regular grid and you use Create Image Directly in Rasterise then it has no effect. If you actually have to rasterise then some interpolation is done, but I do not think it can have an effect here.
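To illustrate the distinction outside Gwyddion (a hedged numpy/scipy sketch, not what Rasterize actually runs; the file name, grid resolution and interpolation method are all arbitrary choices): points already on a regular grid can be reshaped into an image directly, while scattered points need interpolation.

```python
import numpy as np
from scipy.interpolate import griddata

xyz = np.loadtxt("scan.xyz")          # hypothetical N×3 point cloud
x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]

# Scattered points: interpolate onto a regular 512×512 grid.
xi = np.linspace(x.min(), x.max(), 512)
yi = np.linspace(y.min(), y.max(), 512)
image = griddata((x, y), z, (xi[None, :], yi[:, None]), method="linear")
```

If the points already form a regular grid, the image is just the z values reshaped to (yres, xres) and no interpolation enters, which is why the direct option cannot affect the curves.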
Just checked, and the curves look practically identical. I should probably add that I'm working with .obj files (coming from a 3D scanner). So after importing them, I use the tool "Rasterize XYZ data" for obtaining a processable image. But I don't think that procedure adds any error (as long as it is kept consistent, at least).
What shall I do with the row statistics tool? See the row averages, as the 2D data might be quite noisy.
"Selecting the same rectangle in the flattest region of the reference part could help with the odd features in the corners. But it uses even less data." Yes, then the data would be reduced. Also, I tried with other samples (having more even surfaces) and I didn't get any improvement. "Since the upper part is larger, you can also alternatively try to level the surface using the upper part. The difference works either way, just the sign changes." The issue here is that the treatment might not cause a uniform...
The change seems pretty minor, so it will be very sensitive to the levelling/z-alignment – and the reference part is rather uneven. I might try to suggest a couple of things, but you are in danger of trying to get the results you want to see – as opposed to what the data say (which might not be much, at least not reliably). Selecting the same rectangle in the flattest region of the reference part could help with the odd features in the corners. But it uses even less data. Since the upper part is larger,...
Thanks for your feedback. I'm attaching a picture: the left map is before the treatment, the right map after. In both, the reference (always untreated) surface is in the bottom part, below that groove (which I always exclude from any calculation, because it is very noisy and because I don't need it). So yes, the reference is on one side only. I get credible results only if I do plane leveling from the entire map – which is probably incorrect.
What you write sounds reasonable (although I have some difficulties imagining it exactly without a picture). So the problem may be that the reference part is small/on one side/…, making the procedure ill-conditioned (i.e. numerically not stable)?
Hi all, I need to process 3D models of tiles at different stages of surface treatment. The treated part of the surface erodes (recedes), while the untreated part is not affected in any way. I would like to keep the untreated surface as a reference and calculate the increasing height difference with respect to the treated surface over a sequence of successive treatments. All in all, the mapped surfaces approximate a simple plane. What is the correct way of leveling the maps before any calculation? I assumed...
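A minimal numpy sketch of the masked plane levelling discussed in the replies above: fit a plane to the untreated reference region only, then subtract it from the whole map. The array and the mask are hypothetical stand-ins for what the levelling tools do interactively.

```python
import numpy as np

def level_by_reference(z, mask):
    """Fit z ≈ a*x + b*y + c over the masked (untreated) pixels only,
    then subtract that plane from the entire map."""
    yy, xx = np.indices(z.shape)
    A = np.column_stack([xx[mask], yy[mask], np.ones(int(mask.sum()))])
    (a, b, c), *_ = np.linalg.lstsq(A, z[mask], rcond=None)
    return z - (a * xx + b * yy + c)
```

With both the before and after maps levelled against the same reference region, the treated-minus-untreated step height can be compared between treatments without a whole-map plane fit biasing it.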
updated graph axes and other GUI aspects
- try to read physical dimensions from scanlist if we find a separate Slow Axis Size field
added parameters
most of the GUI working, no calculation yet
- fixed Match voxel size modes to actually match voxel sizes, not brick sizes
- fixed Match pixel size to actually match pixel size, not image size
- fixed inverted meaning of "inverted" for rank-based marking
- fixed swapped arguments of err_SIZE_MISMATCH()
- updated 2.66 news a bit
- added mark by rank option which marks given fraction of the pixels (as opposed to
- formatted to 120 columns
- disabled debugging
- suppress warnings about DIMENSION_LIST
- fixed leaking member_name in process_compound_attribute()
- add metadata also from parent levels
- noted the .aris file extension
- started adding metadata
- in detection, scan the file to a shallower depth first and only scan more deeply when
- do not fail completely when some images cannot be read; apparently this sometimes happens
- added a simple module reading NSID HDF5 files (see the h5py sketch after this list)
- separated datasets which are data scales into a different array during the file scan
- create metadata by rescanning the attribute groups
- moved metadata gathering to a function
- added a function for gathering USID dimension information
- fixed sometimes passing the wrong buffer to H5Rget_name()
- fixed the sign of mean background
- added background extraction
- fixed invalid freedesktop magic
- added inversion and polynomial degree options
- read arrays of fixed size strings
- process also dataset region references, for completeness
- implemented reference reading using the old 1.8 API
fixed bug in poly use
x extended to 7th order experimentally
x extended to 6th order experimentally
added 4th order polynomial terms
added 3rd order polynomial terms
- fixed a typo
- implemented more or less working physical dimensions and metadata
- changed signature of meta separator conversion function to simply take any Container
- keep path separated by / in ghfile.meta
- added a FIXME note for code which requires new HDF5
- added forgotten Makefile.am
- progressing with physical units and scales
- need to pass size+1 as the size when reading reference names
- started EPFL HDF5 data import module
- read fixed string attributes directly into the GString when possible
- handled reference attributes by simply storing the target name
- added a missing file magic comment (not a file module)
- separated individual HDF5 based data formats into their own source files
- scan hdf5 subdir for file magic
- made the function declarations a bit more readable
- split HDF5 utility functions to gwyhdf5.c
- moved the HDF5 module into a subdirectory in preparation for splitting the monster
- small coding style changes
- escape < in freedesktop file magic
- removed old script
fixed bug in polynomial orientation
- added user guide magic comment for DATX
- added some metadata and plots vs. frame number
- make Update button insensitive when there is no kernel image to search
- started implementation of ASD high-speed AFM data format
- moved most of the tag sorting to gwytiff.c but we actually need some of the functions
- moved the GwyTIFF implementation to a C file we compile separately and link with all the
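The NSID/USID entries above revolve around discovering datasets and their dimension scales in an HDF5 tree. A rough Python/h5py illustration of that discovery pass (h5py is used purely for the sketch; the actual module is C against the HDF5 library, and the dataset layout is whatever the file provides):

```python
import h5py

def scan_hdf5_images(path):
    """Walk the HDF5 tree and report 2D datasets together with their
    attached dimension scales, roughly the discovery pass a file
    import module performs during its scan."""
    def visit(name, obj):
        if isinstance(obj, h5py.Dataset) and obj.ndim == 2:
            # Scales are attached through the DIMENSION_LIST attribute.
            scales = [dim[0].name if len(dim) else None for dim in obj.dims]
            print(name, obj.shape, scales)
    with h5py.File(path, "r") as f:
        f.visititems(visit)
```

The DIMENSION_LIST attribute that h5py reads here is the same one whose warnings are suppressed in one of the entries above.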