Adding a large number of records to a file
Started by ragnar.nygard, 14 Jan 2004, 11:41 - 1 reply
Posted on 14 January 2004 - 11:41
Hi, I am trying to add a large number of records to a file: more than 1 million records into a file that already holds millions of records and has several keys, mostly non-unique composite keys. Is there a way to prevent reindexing after every record is added? At the moment I check whether the primary key already exists, then use HAdd if the record is not found and HModify to update it. Is there a better way to do this?
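For reference, a minimal WLanguage sketch of the check-then-add-or-modify loop described above. The file name Customer and the items CustomerID and Quantity are hypothetical placeholders, not names from the original post:

// Upsert one record: seek the primary key, then add or modify.
// Customer, CustomerID and Quantity are hypothetical names.
HReadSeekFirst(Customer, CustomerID, nID)
IF HFound(Customer) THEN
    // Key already present: refresh the non-key items and update
    Customer.Quantity = nQuantity
    HModify(Customer)
ELSE
    // Key not found: fill the record buffer and add it
    Customer.CustomerID = nID
    Customer.Quantity = nQuantity
    HAdd(Customer)
END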
Posted on 16 January 2004 - 09:30
If you create a Hyper File from the Analysis with its keys defined, the file is created automatically when you generate and run the project from the RAD, and all index keys are built correctly.
When you call HAdd, HModify, or HDelete, every key is updated automatically, so no separate re-indexing step is needed.
It is quite unlikely that a Hyper File index will become corrupted, given its stability; if it does, just run HReindex() on the file.
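As a hedged sketch of that fallback, a one-off rebuild of a single file's indexes could look like the following; Customer is again a hypothetical file name, and the call is written with HIndex, which is how the reindexing function appears in WLanguage documentation (the post refers to it as HReindex):

// Rebuild the indexes of one data file after suspected corruption.
// Customer is a hypothetical file name.
IF NOT HIndex(Customer) THEN
    Error("Reindexing failed: " + HErrorInfo())
END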
Regards, King
Ragnar Nygård <ragnar.nygard@farveringen.no> wrote:
> Hi I am trying to add a large number of records to a file (more than 1 million
> records to a file with millions of records) with several keys. Mostly composite
> keys, none unique. Is there a way to prevent reindexing after every record is
> added? I am checking for existence of primary key and using HAdd if record is
> not found and HModify to update. Is there a better way to do this?