MamesJay
Member
- Joined: Dec 21, 2009
- Messages: 79
In another thread I was asking about long-term storage of large amounts of data. It has to be HDDs, because with today's data consumption, DVDs are the new floppy discs.
This time it's about the XXL computer that sits at home holding insane amounts of data, present and available at a click of the mouse. Some would call it a home server; I just don't like the term, it sounds too geeky to me. To me, 'home server' sounds more like somebody celebrating the capabilities of his system instead of using them. But that's just me. I thought I'd mention the term so people get the idea.
If a user has about 100 or 200 gigabytes of data (most of it large and medium-sized files like movies, shows and music), there is no need for an extensive structure and order. Those are laptop numbers. That is the amount of data where a manual search is still a funny little adventure. With two, three or even more terabytes, it's a different story. It becomes a chore that is no fun anymore. The built-in Windows search is a dry nightmare, even if the HDD has been indexed (which has to be a given anyway from a certain point on).
I discovered Google in the early 2000s; for me it was the greatest thing about discovering the Internet, and it's still my #1 for any search. I'm totally used to the Google look and all. A while ago I heard about 'Google Desktop'. It was interesting, but I didn't pay any more attention to it until I realized that this application can fully index and search the text of documents and/or PDF files. And I have lots of documents. It's like a gigantic knowledge base, and the ability to have it indexed, so I can search INSIDE of ALL the documents for a word, in a split second...?! Getting every result from every document at once?! No more searching each single document and getting the results one by one, but getting multiple results in a Google look, with the searched word and passages, to look at the context?! The thought of it blew my mind; it's like a giant leap towards the super-computer I dreamed about. Does anyone have an iPad? I need a new coaster. :lol:
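The trick behind this kind of instant search is an inverted index: instead of scanning every file at query time, you scan everything once and record, for each word, where it occurs. This is just a toy sketch of the idea (not how Google Desktop is actually implemented internally), but it shows how one lookup can return hits with context snippets from every document at once:

```python
# Toy inverted index: maps each word to (document, offset) pairs,
# so a single dictionary lookup finds every occurrence everywhere.
import re
from collections import defaultdict

def build_index(docs):
    """docs: {doc_name: full_text}. Returns word -> [(doc_name, offset), ...]."""
    index = defaultdict(list)
    for name, text in docs.items():
        for match in re.finditer(r"\w+", text.lower()):
            index[match.group()].append((name, match.start()))
    return index

def search(index, docs, word, context=20):
    """Return (doc_name, snippet) for every hit, with surrounding context."""
    hits = []
    for name, pos in index.get(word.lower(), []):
        text = docs[name]
        snippet = text[max(0, pos - context): pos + len(word) + context]
        hits.append((name, snippet))
    return hits

docs = {
    "notes.txt": "The index lets you search every file instantly.",
    "report.txt": "A full-text index trades disk space for search speed.",
}
index = build_index(docs)
for name, snippet in search(index, docs, "index"):
    print(f"{name}: ...{snippet}...")
```

The slow part (reading every file) happens once, up front; after that, every query is a dictionary lookup, which is why results come back in a split second even over huge collections.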
I just wonder how 'Google Desktop' handles large amounts of documents and PDFs. Indexing just the file names should be easy: the name of the file, the end. But how about, let's say, 200 gigabytes of documents? Does that mean the index would be just as large? Is there an amount of data where the whole idea runs into problems? Maybe there are some people here with experience handling large amounts of data with 'Google Desktop'.
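On the size question: a full-text index is normally a fraction of the text it covers, not a second copy, because each distinct word is stored once with a compact list of where it appears (figures around 20-30% of the corpus are commonly quoted for engines like Lucene, though it depends heavily on what is stored per hit). Here is a rough experiment with a deliberately repetitive synthetic corpus, tracking only which documents contain each word; real collections have far bigger vocabularies, so real ratios are higher:

```python
# Rough size experiment: doc-level index vs. the raw text it covers.
# The corpus is synthetic and repetitive, so the ratio here is an
# extreme best case, not a prediction for real document collections.
import json
import re
from collections import defaultdict

docs = {f"doc{i}.txt": ("storage search index desktop document " * 500)
        for i in range(20)}

index = defaultdict(set)
for name, text in docs.items():
    for word in re.findall(r"\w+", text):
        index[word].add(name)  # doc-level posting: which files contain the word

corpus_bytes = sum(len(t.encode()) for t in docs.values())
index_bytes = len(json.dumps({w: sorted(d) for w, d in index.items()}).encode())
print(f"corpus: {corpus_bytes} bytes, index: {index_bytes} bytes, "
      f"ratio: {index_bytes / corpus_bytes:.1%}")
```

An index that also records positions and snippets (as a Google-style tool must, to show context) stores more per hit, but compression of the posting lists still keeps it well below a full copy of the data.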
Maybe I gave some people a new idea or perspective, I don't know.
It would be cool to have a system like that with a 5 TB HDD for data, and maybe a 320 GB SSD for running applications (getting the speed and all). One thing: I would never put this system on the Internet, though. Because of viruses and malware, and then having all my documents indexed on top of it. I might as well print fliers with my credit card number.