Header Data Units

The ImageHDU and CompImageHDU classes are discussed in the section on Images.

The TableHDU and BinTableHDU classes are discussed in the section on Tables.

PrimaryHDU

class astropy.io.fits.PrimaryHDU(data=None, header=None, do_not_scale_image_data=False, uint=False, scale_back=False)

Bases: astropy.io.fits.hdu.image._ImageBaseHDU

FITS primary HDU class.

add_checksum(when=None, override_datasum=False, blocking='standard')

Add the CHECKSUM and DATASUM cards to this HDU with the values set to the checksum calculated for the HDU and the data respectively. The addition of the DATASUM card may be overridden.

Parameters :

when : str, optional

comment string for the cards; by default the comments will represent the time when the checksum was calculated

override_datasum : bool, optional

add the CHECKSUM card only

blocking : str, optional

“standard” or “nonstandard”; whether to compute the sum 2880 bytes at a time or not

Notes

For testing purposes, first call add_datasum with a when argument, then call add_checksum with a when argument and override_datasum set to True. This will provide consistent comments for both cards and enable the generation of a CHECKSUM card with a consistent value.
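For illustration, a minimal sketch of this pattern on an in-memory HDU (the array and the comment string below are arbitrary example values):

import numpy as np
from astropy.io import fits

hdu = fits.PrimaryHDU(data=np.arange(100, dtype=np.int32))

# Use the same `when` comment for both cards so the CHECKSUM value is reproducible
hdu.add_datasum(when='testing')
hdu.add_checksum(when='testing', override_datasum=True)

print(hdu.header['DATASUM'], hdu.header['CHECKSUM'])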

add_datasum(when=None, blocking='standard')

Add the DATASUM card to this HDU with the value set to the checksum calculated for the data.

Parameters :

when : str, optional

Comment string for the card that by default represents the time when the checksum was calculated

blocking : str, optional

“standard” or “nonstandard”; whether to compute the sum 2880 bytes at a time or not

Returns :

checksum : int

The calculated datasum

Notes

For testing purposes, provide a when argument to enable the comment value in the card to remain consistent. This will enable the generation of a CHECKSUM card with a consistent value.

copy()

Make a copy of the HDU; both header and data are copied.

filebytes()

Calculates and returns the number of bytes that this HDU will write to a file.

Parameters :

None

Returns :

Number of bytes

fileinfo()

Returns a dictionary detailing information about the locations of this HDU within any associated file. The values are only valid after a read or write of the associated file with no intervening changes to the HDUList.

Parameters :

None

Returns :

dict or None

The dictionary details information about the locations of this HDU within an associated file. Returns None when the HDU is not associated with a file.

Dictionary contents:

file : File object associated with the HDU
filemode : Mode in which the file was opened (readonly, copyonwrite, update, append, ostream)
hdrLoc : Starting byte location of header in file
datLoc : Starting byte location of data block in file
datSpan : Data size including padding
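A short sketch of reading these locations for the primary HDU of an existing file ('example.fits' is a placeholder path):

from astropy.io import fits

with fits.open('example.fits') as hdul:
    info = hdul[0].fileinfo()
    if info is not None:
        print(info['hdrLoc'], info['datLoc'], info['datSpan'])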

classmethod fromstring(data, fileobj=None, offset=0, checksum=False, ignore_missing_end=False, **kwargs)

Creates a new HDU object of the appropriate type from a string containing the HDU’s entire header and, optionally, its data.

Note: When creating a new HDU from a string without a backing file object, the data of that HDU may be read-only. It depends on whether the underlying string was an immutable Python str/bytes object, or some kind of read-write memory buffer such as a memoryview.

Parameters :

data : str, bytearray, memoryview, ndarray

A byte string containing the HDU’s header and, optionally, its data. If fileobj is not specified, and the length of data extends beyond the header, then the trailing data is taken to be the HDU’s data. If fileobj is specified then the trailing data is ignored.

fileobj : file (optional)

The file-like object that this HDU was read from.

offset : int (optional)

If fileobj is specified, the offset into the file-like object at which this HDU begins.

checksum : bool (optional)

Check the HDU’s checksum and/or datasum.

ignore_missing_end : bool (optional)

Ignore a missing end card in the header data. Note that without the end card the end of the header can’t be found, so the entire data is just assumed to be the header.

kwargs : (optional)

May contain additional keyword arguments specific to an HDU type. Any unrecognized kwargs are simply ignored.
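A hedged sketch of round-tripping a simple HDU through a byte string; the in-memory buffer stands in for bytes read from any source:

import io
import numpy as np
from astropy.io import fits

# Serialize a small HDU to bytes, then rebuild it with fromstring
buf = io.BytesIO()
fits.PrimaryHDU(data=np.zeros((10, 10), dtype=np.float32)).writeto(buf)

hdu = fits.PrimaryHDU.fromstring(buf.getvalue())
print(hdu.data.shape)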

classmethod readfrom(fileobj, checksum=False, ignore_missing_end=False, **kwargs)

Read the HDU from a file. Normally an HDU should be opened with fitsopen() which reads the entire HDU list in a FITS file. But this method is still provided for symmetry with writeto().

Parameters :

fileobj : file object or file-like object

Input FITS file. The file’s seek pointer is assumed to be at the beginning of the HDU.

checksum : bool

If True, verifies that both DATASUM and CHECKSUM card values (when present in the HDU header) match the header and data of all HDUs in the file.

ignore_missing_end : bool

Do not issue an exception when opening a file that is missing an END card in the last header.
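A sketch of reading a single HDU directly from an open file object ('example.fits' is a placeholder; the file pointer must sit at the start of the HDU):

from astropy.io import fits

with open('example.fits', 'rb') as f:
    hdu = fits.PrimaryHDU.readfrom(f, checksum=True)
    print(repr(hdu.header))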

req_cards(keyword, pos, test, fix_value, option, errlist)

Check the existence, location, and value of a required Card.

TODO: Write about parameters

If pos is None, the card can be anywhere in the header. If the card does not exist, the new card will have the fix_value as its value when created. The card’s value is also checked using the test argument.

run_option(option='warn', err_text='', fix_text='Fixed.', fix=None, fixable=True)

Execute the verification with the selected option.

scale(type=None, option='old', bscale=1, bzero=0)

Scale image data by using BSCALE/BZERO.

A call to this method will scale the data and update the BSCALE and BZERO keywords in _header. This method should only be used right before writing to the output file, as the data will be scaled and is therefore not very usable after the call.

Parameters :

type : str, optional

Destination data type, given as a string representing a numpy dtype name (e.g. 'uint8', 'int16', 'float32', etc.). If None, use the current data type.

option : str

How to scale the data: if "old", use the original BSCALE and BZERO values from when the data was read/created. If "minmax", use the minimum and maximum of the data to scale. This option will be overridden by any user-specified bscale/bzero values.

bscale, bzero : int, optional

User-specified BSCALE and BZERO values.
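A sketch of scaling an image to 16-bit integers just before writing it out; the random data and the output filename are illustrative:

import numpy as np
from astropy.io import fits

hdu = fits.PrimaryHDU(data=np.random.random((100, 100)).astype(np.float32))

# Scale using the data's own minimum and maximum, then write immediately,
# since the in-memory data is no longer very usable after the call
hdu.scale('int16', option='minmax')
hdu.writeto('scaled.fits', clobber=True)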

section

Access a section of the image array without loading the entire array into memory. The Section object returned by this attribute is not meant to be used directly by itself. Rather, slices of the section return the appropriate slice of the data, loading only that section into memory.

Sections are mostly obsoleted by memmap support, but should still be used to deal with very large scaled images. See the Data Sections section of the PyFITS documentation for more details.
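A sketch of reading only a cutout from a large image ('large_image.fits' is a placeholder path):

from astropy.io import fits

with fits.open('large_image.fits') as hdul:
    # Only the requested 100x100 cutout is loaded into memory
    cutout = hdul[0].section[200:300, 200:300]
    print(cutout.shape)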

shape

Shape of the image array; should be equivalent to self.data.shape.

size

Size (in bytes) of the data portion of the HDU.

update_ext_name(value, comment=None, before=None, after=None, savecomment=False)

Update the extension name associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If no before or after is specified, it will be appended at the end.

Parameters :

value : str

value to be used for the new extension name

comment : str, optional

Comment to be used for updating; default is None.

before : str or int, optional

name of the keyword, or index of the Card before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

name of the keyword, or index of the Card after which the new card will be placed in the Header.

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified then the current comment will automatically be preserved.

update_ext_version(value, comment=None, before=None, after=None, savecomment=False)

Update the extension version associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If no before or after is specified, it will be appended at the end.

Parameters :

value : str

value to be used for the new extension version

comment : str, optional

Comment to be used for updating; default is None.

before : str or int, optional

name of the keyword, or index of the Card before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

name of the keyword, or index of the Card after which the new card will be placed in the Header.

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified then the current comment will automatically be preserved.
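A sketch showing both update_ext_name and update_ext_version on an in-memory HDU; the name 'SCI' and the version 2 are arbitrary example values:

import numpy as np
from astropy.io import fits

hdu = fits.PrimaryHDU(data=np.zeros((5, 5)))

hdu.update_ext_name('SCI', comment='extension name')
hdu.update_ext_version(2, comment='extension version')

print(hdu.header['EXTNAME'], hdu.header['EXTVER'])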

verify(option='warn')

Verify all values in the instance.

Parameters :

option : str

Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". See Verification options for more info.

verify_checksum(blocking='standard')

Verify that the value in the CHECKSUM keyword matches the value calculated for the current HDU CHECKSUM.

Parameters :

blocking : str, optional

“standard” or “nonstandard”; whether to compute the sum 2880 bytes at a time or not

Returns :

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no CHECKSUM keyword present

verify_datasum(blocking='standard')

Verify that the value in the DATASUM keyword matches the value calculated for the DATASUM of the current HDU data.

Parameters :

blocking : str, optional

“standard” or “nonstandard”; whether to compute the sum 2880 bytes at a time or not

Returns :

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no DATASUM keyword present
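A sketch of checking both keywords on an HDU read from disk ('example.fits' is a placeholder path); the integer return codes are interpreted as listed above:

from astropy.io import fits

with fits.open('example.fits') as hdul:
    hdu = hdul[0]
    # 0 - failure, 1 - success, 2 - keyword not present
    print('CHECKSUM:', hdu.verify_checksum())
    print('DATASUM:', hdu.verify_datasum())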
writeto(name, output_verify='exception', clobber=False, checksum=False)

Write the HDU to a new file. This is a convenience method that provides an easier output interface when only one HDU needs to be written to a file.

Parameters :

name : file path, file object or file-like object

Output FITS file. If opened, must be opened for append (“ab+”).

output_verify : str

Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". See Verification options for more info.

clobber : bool

Overwrite the output file if it exists.

checksum : bool

When True, adds both DATASUM and CHECKSUM cards to the header of the HDU when it is written to the file.
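A sketch of writing a single in-memory HDU to disk ('output.fits' is a placeholder path):

import numpy as np
from astropy.io import fits

hdu = fits.PrimaryHDU(data=np.arange(100.0).reshape(10, 10))

# Overwrite any existing file and record DATASUM/CHECKSUM cards in the header
hdu.writeto('output.fits', clobber=True, checksum=True)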

GroupsHDU

class astropy.io.fits.GroupsHDU(data=None, header=None, name=None)

Bases: astropy.io.fits.hdu.image.PrimaryHDU, astropy.io.fits.hdu.table._TableLikeHDU

FITS Random Groups HDU class.

add_checksum(when=None, override_datasum=False, blocking='standard')

Add the CHECKSUM and DATASUM cards to this HDU with the values set to the checksum calculated for the HDU and the data respectively. The addition of the DATASUM card may be overridden.

Parameters :

when : str, optional

comment string for the cards; by default the comments will represent the time when the checksum was calculated

override_datasum : bool, optional

add the CHECKSUM card only

blocking : str, optional

“standard” or “nonstandard”; whether to compute the sum 2880 bytes at a time or not

Notes

For testing purposes, first call add_datasum with a when argument, then call add_checksum with a when argument and override_datasum set to True. This will provide consistent comments for both cards and enable the generation of a CHECKSUM card with a consistent value.

add_datasum(when=None, blocking='standard')

Add the DATASUM card to this HDU with the value set to the checksum calculated for the data.

Parameters :

when : str, optional

Comment string for the card that by default represents the time when the checksum was calculated

blocking : str, optional

“standard” or “nonstandard”; whether to compute the sum 2880 bytes at a time or not

Returns :

checksum : int

The calculated datasum

Notes

For testing purposes, provide a when argument to enable the comment value in the card to remain consistent. This will enable the generation of a CHECKSUM card with a consistent value.

copy()

Make a copy of the HDU; both header and data are copied.

data

The data of a random group FITS file will be like a binary table’s data.

filebytes()

Calculates and returns the number of bytes that this HDU will write to a file.

Parameters :

None

Returns :

Number of bytes

fileinfo()

Returns a dictionary detailing information about the locations of this HDU within any associated file. The values are only valid after a read or write of the associated file with no intervening changes to the HDUList.

Parameters :

None

Returns :

dict or None

The dictionary details information about the locations of this HDU within an associated file. Returns None when the HDU is not associated with a file.

Dictionary contents:

file : File object associated with the HDU
filemode : Mode in which the file was opened (readonly, copyonwrite, update, append, ostream)
hdrLoc : Starting byte location of header in file
datLoc : Starting byte location of data block in file
datSpan : Data size including padding

classmethod fromstring(data, fileobj=None, offset=0, checksum=False, ignore_missing_end=False, **kwargs)

Creates a new HDU object of the appropriate type from a string containing the HDU’s entire header and, optionally, its data.

Note: When creating a new HDU from a string without a backing file object, the data of that HDU may be read-only. It depends on whether the underlying string was an immutable Python str/bytes object, or some kind of read-write memory buffer such as a memoryview.

Parameters :

data : str, bytearray, memoryview, ndarray

A byte string containing the HDU’s header and, optionally, its data. If fileobj is not specified, and the length of data extends beyond the header, then the trailing data is taken to be the HDU’s data. If fileobj is specified then the trailing data is ignored.

fileobj : file (optional)

The file-like object that this HDU was read from.

offset : int (optional)

If fileobj is specified, the offset into the file-like object at which this HDU begins.

checksum : bool (optional)

Check the HDU’s checksum and/or datasum.

ignore_missing_end : bool (optional)

Ignore a missing end card in the header data. Note that without the end card the end of the header can’t be found, so the entire data is just assumed to be the header.

kwargs : (optional)

May contain additional keyword arguments specific to an HDU type. Any unrecognized kwargs are simply ignored.

parnames

The names of the group parameters as described by the header.
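A sketch of inspecting group parameters from an existing random groups file ('groups.fits' is a placeholder; the parameter names depend entirely on the file):

from astropy.io import fits

with fits.open('groups.fits') as hdul:
    ghdu = hdul[0]                 # random groups data live in the primary HDU
    print(ghdu.parnames)           # group parameter names from the header

    # Fetch the values of one group parameter by name via GroupData.par
    first = ghdu.parnames[0]
    print(ghdu.data.par(first)[:5])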

classmethod readfrom(fileobj, checksum=False, ignore_missing_end=False, **kwargs)

Read the HDU from a file. Normally an HDU should be opened with fitsopen() which reads the entire HDU list in a FITS file. But this method is still provided for symmetry with writeto().

Parameters :

fileobj : file object or file-like object

Input FITS file. The file’s seek pointer is assumed to be at the beginning of the HDU.

checksum : bool

If True, verifies that both DATASUM and CHECKSUM card values (when present in the HDU header) match the header and data of all HDUs in the file.

ignore_missing_end : bool

Do not issue an exception when opening a file that is missing an END card in the last header.

req_cards(keyword, pos, test, fix_value, option, errlist)

Check the existence, location, and value of a required Card.

TODO: Write about parameters

If pos is None, the card can be anywhere in the header. If the card does not exist, the new card will have the fix_value as its value when created. The card’s value is also checked using the test argument.

run_option(option='warn', err_text='', fix_text='Fixed.', fix=None, fixable=True)

Execute the verification with the selected option.

scale(type=None, option='old', bscale=1, bzero=0)

Scale image data by using BSCALE/BZERO.

A call to this method will scale the data and update the BSCALE and BZERO keywords in _header. This method should only be used right before writing to the output file, as the data will be scaled and is therefore not very usable after the call.

Parameters :

type : str, optional

Destination data type, given as a string representing a numpy dtype name (e.g. 'uint8', 'int16', 'float32', etc.). If None, use the current data type.

option : str

How to scale the data: if "old", use the original BSCALE and BZERO values from when the data was read/created. If "minmax", use the minimum and maximum of the data to scale. This option will be overridden by any user-specified bscale/bzero values.

bscale, bzero : int, optional

User-specified BSCALE and BZERO values.

section

Access a section of the image array without loading the entire array into memory. The Section object returned by this attribute is not meant to be used directly by itself. Rather, slices of the section return the appropriate slice of the data, loading only that section into memory.

Sections are mostly obsoleted by memmap support, but should still be used to deal with very large scaled images. See the Data Sections section of the PyFITS documentation for more details.

shape

Shape of the image array; should be equivalent to self.data.shape.

size

Returns the size (in bytes) of the HDU’s data part.

update_ext_name(value, comment=None, before=None, after=None, savecomment=False)

Update the extension name associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If no before or after is specified, it will be appended at the end.

Parameters :

value : str

value to be used for the new extension name

comment : str, optional

Comment to be used for updating; default is None.

before : str or int, optional

name of the keyword, or index of the Card before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

name of the keyword, or index of the Card after which the new card will be placed in the Header.

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified then the current comment will automatically be preserved.

update_ext_version(value, comment=None, before=None, after=None, savecomment=False)

Update the extension version associated with the HDU.

If the keyword already exists in the Header, its value and/or comment will be updated. If it does not exist, a new card will be created and placed before or after the specified location. If no before or after is specified, it will be appended at the end.

Parameters :

value : str

value to be used for the new extension version

comment : str, optional

Comment to be used for updating; default is None.

before : str or int, optional

name of the keyword, or index of the Card before which the new card will be placed in the Header. The argument before takes precedence over after if both are specified.

after : str or int, optional

name of the keyword, or index of the Card after which the new card will be placed in the Header.

savecomment : bool, optional

When True, preserve the current comment for an existing keyword. The argument savecomment takes precedence over comment if both are specified. If comment is not specified then the current comment will automatically be preserved.

verify(option='warn')

Verify all values in the instance.

Parameters :

option : str

Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". See Verification options for more info.

verify_checksum(blocking='standard')

Verify that the value in the CHECKSUM keyword matches the value calculated for the current HDU CHECKSUM.

Parameters :

blocking : str, optional

“standard” or “nonstandard”; whether to compute the sum 2880 bytes at a time or not

Returns :

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no CHECKSUM keyword present

verify_datasum(blocking='standard')

Verify that the value in the DATASUM keyword matches the value calculated for the DATASUM of the current HDU data.

Parameters :

blocking : str, optional

“standard” or “nonstandard”; whether to compute the sum 2880 bytes at a time or not

Returns :

valid : int

  • 0 - failure
  • 1 - success
  • 2 - no DATASUM keyword present

writeto(name, output_verify='exception', clobber=False, checksum=False)

Write the HDU to a new file. This is a convenience method that provides an easier output interface when only one HDU needs to be written to a file.

Parameters :

name : file path, file object or file-like object

Output FITS file. If opened, must be opened for append (“ab+”).

output_verify : str

Output verification option. Must be one of "fix", "silentfix", "ignore", "warn", or "exception". See Verification options for more info.

clobber : bool

Overwrite the output file if it exists.

checksum : bool

When True, adds both DATASUM and CHECKSUM cards to the header of the HDU when it is written to the file.

GroupData

class astropy.io.fits.GroupData

Bases: astropy.io.fits.fitsrec.FITS_rec

Random groups data object.

Allows structured access to FITS Group data in a manner analogous to tables.

par(parname)

Get the group parameter values.

StreamingHDU

class astropy.io.fits.StreamingHDU(name, header)

Bases: object

A class that provides the capability to stream data to a FITS file instead of requiring data to all be written at once.

The following pseudocode illustrates its use:

header = astropy.io.fits.Header()

for all the cards you need in the header:
    header[key] = (value, comment)

shdu = astropy.io.fits.StreamingHDU('filename.fits', header)

for each piece of data:
    shdu.write(data)

shdu.close()
close()

Close the physical FITS file.

size

Return the size (in bytes) of the data portion of the HDU.

write(data)

Write the given data to the stream.

Parameters :

data : ndarray

Data to stream to the file.

Returns :

writecomplete : int

Flag that, when True, indicates that all of the required data has been written to the stream.

Notes

Only the amount of data specified in the header provided to the class constructor may be written to the stream. If the provided data would cause the stream to overflow, an IOError exception is raised and the data is not written. Once sufficient data has been written to the stream to satisfy the amount specified in the header, the stream is padded to fill a complete FITS block and no more data will be accepted. An attempt to write more data after the stream has been filled will raise an IOError exception. If the dtype of the input data does not match what is expected by the header, a TypeError exception is raised.
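A hedged, more concrete version of the pseudocode above: streaming a 100 x 100 float32 image ten rows at a time. The filename 'stream.fits' and the header keyword values are illustrative; the header must fully describe the data that will eventually be written.

import numpy as np
from astropy.io import fits

header = fits.Header()
header['SIMPLE'] = True
header['BITPIX'] = -32       # 32-bit floating point
header['NAXIS'] = 2
header['NAXIS1'] = 100
header['NAXIS2'] = 100

shdu = fits.StreamingHDU('stream.fits', header)

# Write the image in ten chunks of ten rows each
for _ in range(10):
    chunk = np.zeros((10, 100), dtype=np.float32)
    writecomplete = shdu.write(chunk)

shdu.close()
print(writecomplete)         # True once all data promised by the header is written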