Search Results

Search found 35173 results on 1407 pages for 'critical path software'.


  • Photoshop CS5 performance over network drive (cifs)

    - by grub
    Hello everyone. I installed a QNAP TS-410 NAS for a customer (a professional photographer) with three Hitachi Deskstar 7200 rpm 2 TB disks configured as RAID 5. The NAS and the workstations are connected over a gigabit network. He and his co-worker access the photos (about 1 TB of them) over a mapped network drive from their Windows machines (Windows XP 32-bit and Windows 7 Ultimate 32-bit). Both use Photoshop CS5 to edit the photos.

    The problem is that saving an edited photo takes a really long time: about three times as long as opening it. After some tests I can exclude the network, the NAS and the Windows machines as the source of the issue. I think the problem is the Photoshop software and its handling of network drives; officially, network drives are not supported by Adobe. I have no experience with Adobe products, especially Adobe Photoshop CS5.

    What do you recommend to solve the performance issue? Should my customer copy the photos to the local drive, edit them and upload them back to the network drive, or is Adobe Drive or Adobe Version Cue the answer? One requirement is that the photos need to be accessible and editable from both computers, even when one of them is offline. As far as I understand the Cue software, Adobe Version Cue needs a dedicated service running to be usable, so that solution is not possible.

    Thank you for your input on this issue, and have a nice day :-) Greetings, grub
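
    If the local-copy workflow turns out to be the answer, it can be scripted so the photographers don't have to manage it by hand. A minimal sketch using robocopy, assuming the share is mapped as P: and C:\PhotoWork is the local working folder (both names are hypothetical):

      rem pull the current job down to the fast local disk
      robocopy P:\Jobs\CurrentJob C:\PhotoWork\CurrentJob /E /Z /R:2 /W:5

      rem ... edit locally in Photoshop ...

      rem push only new or changed files back to the NAS
      robocopy C:\PhotoWork\CurrentJob P:\Jobs\CurrentJob /E /Z /XO /R:2 /W:5

    The /XO switch skips files that are older than the copy already on the NAS, so two people syncing the same job won't clobber each other's newer saves.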


  • Nginx proxy to IIS Connection Timeout

    - by MitMaro
    I am having an issue with random timeouts on an Nginx proxy connecting to an IIS machine. I have been watching a packet capture between the two servers, and it seems that the IIS machine receives a SYN packet but does not respond with what I think should be a SYN/ACK. Before the timeout occurs there seems to be a slower response from the IIS server. There is no unusual memory or processor usage on either the IIS or the Nginx machine.

    Some information on the servers and setup:

    Nginx machine: Ubuntu 10.04 64-bit, Nginx 0.7.65, Amazon EC2
    Windows machine: Windows Server 2008, IIS 7, ASP.NET application in Integrated mode

    Nginx error:

      2011/01/10 17:57:40 [error] 8297#0: *30 connect() failed (110: Connection timed out) while connecting to upstream, client: 209.***.***.***, server: secure.example.com, request: "GET /a/path/deliver.aspx HTTP/1.1", upstream: "http://***.***.***.****:****//another/path/deliver.aspx", host: "secure.example.com"

    Wireshark packets:

      6521.449528 10.***.***.*** -> 174.***.***.*** TCP 38695 > us-cli [SYN] Seq=0 Win=5840 Len=0 MSS=1460 TSV=477422103 TSER=0 WS=7
      6524.443239 10.***.***.*** -> 174.***.***.*** TCP 38695 > us-cli [SYN] Seq=0 Win=5840 Len=0 MSS=1460 TSV=477422403 TSER=0 WS=7
      6530.443241 10.***.***.*** -> 174.***.***.*** TCP 38695 > us-cli [SYN] Seq=0 Win=5840 Len=0 MSS=1460 TSV=477423003 TSER=0 WS=7
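
    As an aside, the doubled slash in the logged upstream URL ("…:****//another/path/…") often comes from a trailing slash on both the upstream server line and the proxy_pass path. A hedged configuration sketch (the upstream address, paths and timeout values are placeholders, not taken from the poster's config):

      upstream iis_backend {
          # no trailing slash here; one here plus one in proxy_pass
          # produces the "//" seen in the error log
          server 10.0.0.10:8080;
      }

      server {
          location /a/path/ {
              proxy_pass            http://iis_backend/another/path/;
              proxy_connect_timeout 10s;  # fail fast instead of the 60s default
              proxy_read_timeout    60s;
              proxy_next_upstream   error timeout;  # retry on connect errors
          }
      }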


  • How to backup a large FreeNAS?

    - by Ze'ev
    We have a 12TB FreeNAS box in the office and are looking for a way to keep a backup of it offsite. We're considering (1) tape; (2) a bunch of bare drives (popped into a spare hotswap bay); (3) external drives. Any advice on which solution is best? (Online backup is not an option because our internet connection is too slow.)

    Also, is there some software that will keep track of which files have been backed up and which haven't, so that when one backup unit fills up, we can continue the backup on the next? (We don't want to have to back up to a single 12TB device.) This software could run, preferably, on the NAS itself, or from one of our Mac clients.

    Our goal is a situation where: we attach some backup device; it automatically fills up with stuff from the server; the contents of the unit are catalogued somewhere; something prompts us to replace it with a fresh drive/tape; and backup continues until full, including any files that have changed since being backed up.
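
    There may be packaged tools for this, but the catalogue idea is simple enough to sketch in shell (all paths and the free-space threshold below are assumptions): walk the top-level directories, copy each one to whatever drive is attached, and record completions so the next drive resumes where the last one stopped.

      #!/bin/sh
      SRC=/mnt/tank            # FreeNAS data set (assumed path)
      DST=/mnt/backup          # currently attached backup drive (assumed)
      DONE=/root/backup.done   # catalogue of finished directories

      touch "$DONE"
      for dir in "$SRC"/*/; do
          name=$(basename "$dir")
          grep -qx "$name" "$DONE" && continue   # already on an earlier drive
          # stop when the attached drive has less than ~50 GB free
          avail=$(df -k "$DST" | awk 'NR==2 {print $4}')
          [ "$avail" -lt 52428800 ] && { echo "Drive full, swap it"; exit 1; }
          rsync -a "$dir" "$DST/$name/" && echo "$name" >> "$DONE"
      done

    This handles the "continue on the next unit" requirement but not changed-file tracking; rerunning rsync against earlier drives would refresh those.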


  • Windows 7 64-bit Google Chrome Makes Desktop Unresponsive

    - by Meengla
    For the past one to two weeks my desktop has sometimes become unresponsive. The problem seems to happen as I load a page in Google Chrome; it could be any page, though today it happened when I tried to load a page with Flash content. The problem is not consistent and there is no pattern. I could run a virus scan with some software (no antivirus installed; I've never needed it), but I strongly suspect it is some software I installed or some Windows update. This is a very powerful Dell OptiPlex 990 with 16 GB of RAM, so memory shouldn't be an issue.

    When the problem happens, the cursor becomes the spinning wait cursor even over the taskbar, and Ctrl+Alt+Del takes a long time. Eventually I get a message that "this program is not responding" with End/Close and Cancel buttons, but repeated End or Cancel does nothing. Then the Ctrl+Alt+Del menu comes up. The rest of the applications keep running fine, though I can't get to them because the cursor is the wait cursor.

    What is happening? How can I find out exactly what caused the problem? Thanks.


  • Removing Paths/ Landing Pages From SharePoint Search Results

    - by j.strugnell
    Hi there,

    We've been asked by a client to remove a number of pages from showing up in their public website search results page. I've been into the SSP and created crawl rules to remove these pages. All seemed to work OK, but we have an issue in that landing pages are still showing up in their "www.domain.com/sitearea/" form, though not in their "www.domain.com/sitearea/pages/default.aspx" form. For each page of this type we have created one rule to exclude the "aspx" path, and another rule to include the "/" path but to "Follow links on the URL without crawling the URL itself". We tried adding rules to exclude the "/" format, but that only resulted in everything underneath it being excluded. Does anybody know how to remove both the "area/pages/default.aspx" and the "area/" paths from search results?

    I'm not sure if it's the done thing to ask two questions in one, but this is in a similar vein so it should be OK. I was wondering if anyone knew of a tool (or if it is possible) to allow site admins to exclude pages from search results without going via SSP crawl rules. I know they can do it at the site level, but is there anything out there that enables this at the page level, through either Page or Site Settings?


  • Why am I not able to create a backup plan for TFS?

    - by noocyte
    I am trying to create a backup plan using the TFS Power Tools but I keep running into this error message:

    I have checked that the account has Full Control on the share; I can edit, create and delete files there.

    From the log:

      [Info @07:15:00.403] Starting creating backup test validation
      [Error @07:15:00.700] Microsoft.SqlServer.Management.Smo.FailedOperationException: Backup failed for Server 'WMSI003714N\SqlExpress'. ---> Microsoft.SqlServer.Management.Common.ExecutionFailureException: An exception occurred while executing a Transact-SQL statement or batch. ---> System.Data.SqlClient.SqlException: Cannot open backup device '\\wmsi003714n\sql dump\Tfs_Configuration_20100910091500.bak'. Operating system error 5(failed to retrieve text for this error. Reason: 1815). BACKUP DATABASE is terminating abnormally.
         at Microsoft.SqlServer.Management.Common.ConnectionManager.ExecuteTSql(ExecuteTSqlAction action, Object execObject, DataSet fillDataSet, Boolean catchException)
         at Microsoft.SqlServer.Management.Common.ServerConnection.ExecuteNonQuery(String sqlCommand, ExecutionTypes executionType)
         --- End of inner exception stack trace ---
         at Microsoft.SqlServer.Management.Common.ServerConnection.ExecuteNonQuery(String sqlCommand, ExecutionTypes executionType)
         at Microsoft.SqlServer.Management.Common.ServerConnection.ExecuteNonQuery(StringCollection sqlCommands, ExecutionTypes executionType)
         at Microsoft.SqlServer.Management.Smo.ExecutionManager.ExecuteNonQuery(StringCollection queries)
         at Microsoft.SqlServer.Management.Smo.BackupRestoreBase.ExecuteSql(Server server, StringCollection queries)
         at Microsoft.SqlServer.Management.Smo.Backup.SqlBackup(Server srv)
         --- End of inner exception stack trace ---
         at Microsoft.SqlServer.Management.Smo.Backup.SqlBackup(Server srv)
         at Microsoft.TeamFoundation.PowerTools.Admin.Helpers.BackupFactory.TestBackupCreation(String path)
      [Error @07:15:00.731] !Verify Error!: Account GROUPINFRA\SA-NO-TeamService failed to create backups using path \\wmsi003714n\sql dump
      [Info @07:15:00.731] "Verify: Grant Backup Plan Permissions\Root\VerifyDummyBackupCreation(VerifyTestBackupCreatedSuccessfully): Exiting Verification with state Completed and result Error"

    Any ideas?
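
    For context, operating system error 5 is "access denied", and the backup file is written by the service account (here apparently GROUPINFRA\SA-NO-TeamService, per the log) rather than by the interactive user, so Full Control for your own account is not enough. A hedged sketch of granting NTFS rights on the folder behind the share (the local path is an assumption; the account name is the one from the log; the SQL Server service account may need the same grant):

      rem run on wmsi003714n, where D:\sql dump is assumed to back the share
      icacls "D:\sql dump" /grant "GROUPINFRA\SA-NO-TeamService:(OI)(CI)M"

    The share-level permissions for the same account also need at least Change, which can be verified in the share's Properties dialog.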


  • While using an ntfs smb share for mac users, do symbolic links and extended attributes work?

    - by scape
    We have a majority of Mac users, but we'd rather support their file sharing using a Windows server with an NTFS drive, or at least a Linux server with ext3. We've had much trouble with the OS X Server software over the years and are now looking to abandon it. What's mostly holding us back is that the Mac users very often use symbolic links and other special features that exist on an HFS+ partition. The shared locations are mostly primary storage, not just archive storage.

    While there is an option to create symbolic links under NTFS, I'm curious whether there is anything I need to look out for if I move the files from the HFS+ partition to a new partition hosted on a Windows server, and in addition, how well creating a symbolic link from a Mac would work. I am also worried about Windows backup software ruining these special symlinks, and about how placing permissions on sub-folders will work. Alternatively, I could back up the files remotely using a Mac and Bru; nonetheless, I still want to get away from Mac servers for hosting the shares.
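
    One way to settle the symlink and extended-attribute question before migrating is an empirical test from a Mac against a share on the candidate server (a minimal sketch; the mount point name is an assumption):

      cd /Volumes/TestShare          # the mounted SMB share (assumed name)

      # symbolic links: does creation succeed, and does the link resolve?
      touch target.txt
      ln -s target.txt link.txt
      ls -l link.txt && cat link.txt

      # extended attributes: write one and read it back
      xattr -w com.example.test "hello" target.txt
      xattr -l target.txt

      # Finder metadata the server can't store natively usually appears
      # as AppleDouble "._" companion files; check for them
      ls -la

    Unmounting and remounting the share between the write and the read makes the test stricter, since some of this metadata can be cached on the client side.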


  • Transfering Files to server IP and port

    - by Mason
    I need to transfer files from my local computer running Windows 7 to a server running Linux. I access the server with PuTTY through SSH at a specific IPv4 address and port number. I've attempted using the pscp command from my local computer but was denied access by the server:

      Fatal: Network error: Connection refused

      c:\>pscp test.csv userid@**IPv4_Addres***:Port# /path/destination_file_name

    Either the server blocks all pscp attempts from unauthorized users (most likely my laptop included) or I used the command incorrectly. If you have experience using this command: where exactly will the file get transferred to? I'm assuming that the destination path starts at my home directory on the server. Also, if you have any other alternative methods of transferring the files, let me know.

    Update 1: I have also tried using WinSCP, but I got permission denied for that as well; it looks like the server will not let me upload or save files.

    Solved: I had a complete lapse of memory and forgot about sudo (spent too much time with scripts the last two months), so I was able to change the permissions to allow external editing. Thanks for all the help, guys!
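
    For reference, pscp takes a non-standard port as a -P flag rather than as part of the host string, and the remote path goes after the colon; a sketch with placeholder values:

      rem -P sets the SSH port; the part after the colon is the remote path
      pscp -P 2222 test.csv userid@203.0.113.10:/home/userid/test.csv

      rem copying in the other direction works the same way
      pscp -P 2222 userid@203.0.113.10:/home/userid/test.csv C:\temp\

    With a bare filename (or nothing) after the colon, the file lands in the login user's home directory.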


  • Juniper NetScreen NS-5GT traffic monitoring

    - by blah
    I've done casual research into the subject and am truly dismayed at the lack of compatible tools for such a simple task. Maybe someone can provide assistance.

    We have a NetScreen NS-5GT in the office. I need to be able to get a glance of current traffic per endpoint, roughly the equivalent of 'get sessions' with byte counts/rates. I don't care about bars, graphs and reports; something as simple as a classic software firewall display would be perfect. I can't shell out money on something real like SolarWinds products, so a free solution is essential. I'm willing to do a little work, but I refuse to program something from scratch, and it's not prudent right now for me to install a hub or otherwise mess around physically. There must be something out there I can use, maybe in combination; I don't believe I'm asking too much.

    Specific answers only please, e.g. monitoring software you know will actually work with this antiquated device. I've read about general approaches to the broader problem dozens of times already.


  • How to resize root LVM partition in Fedora without LiveCD or Rebooting

    - by Cerin
    I have a virtual machine that recently had its disk image increased from 20GB to 50GB, and fdisk -l verifies that the VM can see this new size. Now I need to resize my root LVM partition to fill the extra 30GB. I've found several articles about resizing LVM, but the few that cover resizing the root partition all claim you need to boot from a LiveCD. Is there any way to do this without taking down the server? The server is critical, so I'd like to minimize downtime.
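
    For what it's worth, an ext3 root filesystem on LVM can usually be grown online. A hedged sketch, assuming the new space is carved into a third partition /dev/vda3 and the volume group and root LV are named VolGroup00 and lv_root (check the real names with vgs and lvs; every device name here is an assumption):

      # after creating a partition on the new space with fdisk/parted
      # (partprobe may be needed if the kernel won't re-read the table)
      pvcreate /dev/vda3                     # turn it into a physical volume
      vgextend VolGroup00 /dev/vda3          # add it to the root volume group
      lvextend -l +100%FREE /dev/VolGroup00/lv_root
      resize2fs /dev/VolGroup00/lv_root      # ext3/ext4 grow online, no reboot

    The LiveCD advice mostly applies to shrinking a root filesystem; growing one is generally safe while mounted, though a backup first is still prudent on a critical server.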


  • Samba access works with IP address only

    - by Sebastian Rittau
    I added a Debian etch host (hostname: webserver, IP address: 192.168.101.2) running Samba to a Windows network with a Windows 2003 PDC (IP address 192.168.101.3). The Samba server exports a public guest share called "Intranet". The server shows up fine in the network, but trying to click on it produces an error dialog stating I don't have the necessary permissions. Entering \\webserver manually fails the same way, and using \\webserver\intranet states that the path does not exist. Interestingly, accessing the share by IP address (\\192.168.101.2 or \\192.168.101.2\intranet) works fine. DNS is configured correctly, and "smbclient //webserver/intranet" on another Linux client works fine. One complicating issue is that webserver is only a VMware virtual machine running on the PDC server.

    Here is our smb.conf:

      [global]
      workgroup = Foobar
      server string = Webserver
      wins support = yes            ; commenting out these
      wins server = 192.168.101.3   ; two lines has no effect
      dns proxy = no
      guest account = nobody
      [... snipped some unrelated bits, like logging ...]
      security = share
      [... snipped some password-related things ...]
      domain master = no

      [intranet]
      comment = Intranet
      path = /srv/webserver/contents
      browseable = yes
      guest ok = yes
      guest only = yes
      read only = yes
      create mask = 0775
      directory mask = 0775


  • Mac OS X: Update Python for Shell

    - by Nathan G.
    So, I see similar questions, but none of the answers work for me. I updated Python from 2.6.1 to 3.1.3. Everything works, except: when I type python into Terminal, I get:

      Python 2.6.1 (r261:67515, Jun 24 2010, 21:47:49)
      [GCC 4.2.1 (Apple Inc. build 5646)] on darwin
      Type "help", "copyright", "credits" or "license" for more information.
      >>>

    So, how do I change the version of Python that runs in the shell? I've tried the script that they provide. It adds their directory to my $PATH, but it still doesn't change the version that's displayed from Terminal. Here's what I get when I echo $PATH:

      /Library/Frameworks/Python.framework/Versions/3.1/bin:/Library/Frameworks/Python.framework/Versions/3.1/bin:/Library/Frameworks/Python.framework/Versions/3.1/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin:/usr/X11/bin

    It appears that the script has added their directory every time I ran it (I tried it a few times, naturally). These are the other relevant folders it mentions:

      /Library/Frameworks/Python.framework/Versions/3.1/bin
      /usr/local/bin
      /usr/bin

    Thanks in advance for any ideas!
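
    One likely explanation (an assumption based on how the python.org framework installers usually behave): the 3.1 framework installs binaries named python3 and python3.1, not python, so even with its bin directory first in $PATH, plain python still resolves to Apple's /usr/bin/python 2.6.1. A quick check-and-alias sketch:

      # what does the shell actually find?
      which python       # likely /usr/bin/python (Apple's 2.6.1)
      which python3      # should be .../Versions/3.1/bin/python3

      # run the new interpreter explicitly...
      python3

      # ...or alias it for interactive shells; an alias is safer than a
      # symlink because OS X system scripts expect "python" to be 2.x
      echo 'alias python="python3"' >> ~/.bash_profile
      source ~/.bash_profile

    The duplicated $PATH entries are harmless but can be cleaned up by trimming the lines the install script appended to ~/.bash_profile (or ~/.profile) down to a single copy.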


  • How to locate phpmyadmin on ubuntu

    - by Chris
    Okay, I'm usually a Windows user, and I write quite happily there. Unfortunately (or fortunately) I have installed Linux on a dual boot, and having installed some software I have a question... where is it?

    I installed Apache, PHP and MySQL, and separately phpMyAdmin. Apache is up and running, I've seen my phpinfo page, and MySQL is there. MySQL is telling me that there's a database for phpMyAdmin, but... erm... I can't seem to locate it. On a Windows machine the directory would be in the www directory and I'd just navigate there: localhost/phpmyadmin/. But on Ubuntu I can't find it in the equivalent place. I've been to /var/www/ and there's my index.html (from Apache) and my phptest.php file, but no phpmyadmin. There is a phpmyadmin directory in /lib, but that only has two files in it.

    So, having rambled lots, my question is: what do I have to do to be able to navigate to the phpMyAdmin index page? I realise this could fall under the description of a server-related question and should be posted elsewhere, but as it's software on a home system, some help would be appreciated. Do I need to move some files from somewhere? Help! I really don't want to have to go back to developing on Windows, as I'll be deploying to a LAMP system and my learning curve will be steep.
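
    Assuming phpMyAdmin was installed from the Ubuntu package (apt-get install phpmyadmin), the application deliberately lives outside the web root: the code is in /usr/share/phpmyadmin and its Apache snippet in /etc/phpmyadmin/apache.conf, so nothing shows up under /var/www. A sketch of the two usual fixes:

      # option 1: make Apache load the packaged configuration
      sudo sh -c 'echo "Include /etc/phpmyadmin/apache.conf" >> /etc/apache2/apache2.conf'
      sudo /etc/init.d/apache2 restart

      # option 2: symlink the application into the web root
      sudo ln -s /usr/share/phpmyadmin /var/www/phpmyadmin

    Either way, http://localhost/phpmyadmin/ should then come up. If it was instead installed from a downloaded archive, the answer is simply wherever the archive was unpacked.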


  • Force logoff of idle users on Windows 7 workstation with fast user switching enabled

    - by newmanth
    We have mission-critical Windows 7 workstations on our network that must be available to any user at any time, even when it has been locked by a prior user. Thus, we have fast user switching enabled. Unfortunately, it's not unusual for us to have a dozen or more different users logged onto the same machine at the same time, with a corresponding degradation in service. We've done our best at educating the masses to log off at the end of their shift. But users being users, this does not happen on a consistent basis. Does anyone know of a clean way to force logoff idle users after a certain amount of time has elapsed? I am open to any method that could be deployed/configured via script, GPO, or SCCM.
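
    One script-based angle (a sketch, not a polished solution: quser's column layout shifts for disconnected sessions and is locale-dependent, so treat the parsing as an assumption to verify) is a scheduled task that logs off any disconnected session, which is what a fast-user-switched session becomes once someone else takes the console:

      # PowerShell: log off every disconnected session on this machine
      quser 2>$null | Select-Object -Skip 1 | ForEach-Object {
          $fields = ($_ -split '\s+') | Where-Object { $_ }
          if ($_ -match 'Disc') {
              # with no SESSIONNAME column, the session ID is the second field
              logoff $fields[1]
          }
      }

    Run it every few minutes from Task Scheduler under an administrative account; an idle-time threshold could be added by parsing the IDLE TIME column the same way. Users do lose unsaved work on a forced logoff, which is worth weighing against the availability requirement.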


  • Setup a automatic server reboot on when a particular service fails

    - by user1179459
    I am running a Linux-based server (CentOS 6.0) with cPanel and WHM. I have a critical website running alongside a chat server that uses Openfire as its backend. Over the last few weeks I have noticed that this service crashes quite often, I have no way of knowing when it happens, and I have to wait until the next day to restart the server. (This can only be fixed by a server reboot, as it seems to be a Java memory problem.)

    Is there a way I can set up a monitoring service so that if Openfire goes down, the server reboots itself? Is this possible, or is there a better way to overcome this problem?
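
    A minimal watchdog sketch in shell, assuming Openfire's init script is at /etc/init.d/openfire (the path and process name are assumptions to adapt). It first tries a service restart and only falls back to the reboot described above:

      #!/bin/sh
      # /root/check_openfire.sh - restart Openfire if its JVM has died
      if ! pgrep -f openfire > /dev/null; then
          /etc/init.d/openfire restart
          sleep 30
          # still not running? fall back to a full reboot
          pgrep -f openfire > /dev/null || /sbin/reboot
      fi

    Installed from root's crontab (crontab -e) to run every five minutes:

      */5 * * * * /root/check_openfire.sh

    Fixing the underlying Java memory problem (often just raising the JVM heap size in Openfire's startup options) would of course be better than rebooting around it.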


  • Converting an ancient RH8 system to VMware ESXi

    - by donatello
    I am curious to know what options I have to convert a very old Red Hat 8 machine to a virtual one on ESXi. Looking at VMware Converter, it seems there's an option to log in to the RH8 machine using SSH, and from there it will convert to the ESXi server. That makes me a bit nervous, though; what exactly is happening there? The RH8 machine is slightly critical, and if anything messes up it'll likely result in many hours of extra work. :(

    Another option I thought of was to boot a LiveCD on the RH8 system and create a raw "dd dump" of the disk. A similar method would be used to restore the image: boot a LiveCD on the VM in ESXi and use "dd" to write it to disk.

    Is there any other option I could use? I'm using the cheap version of ESXi, so I have no access to the Converter BootCD, and these rather cumbersome methods are the only ones I can think of. :)
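
    A sketch of the dd route (every device name and path here is an assumption to adapt; booting from the LiveCD keeps the filesystem quiescent while it is imaged):

      # on the RH8 box, booted from a LiveCD, image the disk to NFS/USB storage
      dd if=/dev/hda of=/mnt/backup/rh8.img bs=64k conv=sync,noerror

      # on any Linux box with qemu-img, convert the raw image to a VMDK
      qemu-img convert -f raw -O vmdk /mnt/backup/rh8.img rh8.vmdk

      # upload to the ESXi datastore; vmkfstools -i (run on the ESXi host)
      # clones it into a disk format ESXi is happy to run from
      vmkfstools -i rh8.vmdk /vmfs/volumes/datastore1/rh8/rh8-esx.vmdk

    This keeps the production box strictly read-only during the whole exercise, which addresses the "slightly critical" worry: if the conversion fails, nothing on the original has changed.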


  • Freshly installed dd-wrt on dir-300 and no internet. What to do?

    - by Erik B
    With the D-Link firmware I just connected the router and it worked, as is expected when using DHCP, but with DD-WRT I have no internet access. It is configured with DHCP, and DD-WRT's WAN status page reports that it is connected and has an IP address, yet it is impossible to reach the internet. If I disconnect the router and plug the cable directly into my computer, I get internet access, so it's obviously the DD-WRT software that isn't doing its job. However, I have no previous experience with DD-WRT and have no clue what to look for; I thought it would just work.

    By the way, the power LED is orange, the internet LED is off, and wireless + LAN1 are green. They all used to be green with the D-Link firmware. Not sure if it's relevant, but now you know.

    Does anyone have any idea what I should do to get internet access (besides reinstalling D-Link's firmware)?

    EDIT: It's a version B1. I read that it is very different from the A1 version, so I thought it may be relevant.


  • IIS7 can't read web.config on shared Mac filesystem

    - by RobG
    I'm running a virtualized Windows 2008 Server in VirtualBox on my Mac, which I just finished setting up today. On it I have SQL Server 2008, IIS and ColdFusion 9. I want to serve websites from my Mac filesystem (for development purposes), so I created a new website in IIS and pointed it at the appropriate UNC path: \\vboxsvr\rob\Sites\testsite, which contains the ColdFusion code and a web.config file. When I attempt to modify the file at all, or view the site in a web browser, I get an error:

      HTTP 500.19 - Internal Server Error
      The requested page cannot be accessed because the related configuration data for the page is invalid.

    I did some Googling and found several similar problems, but nothing exactly like mine. The closest one seemed to indicate permissions, so I recreated the site and set it up to allow the Administrator (in Windows) to access the stuff. That didn't help. I can read and modify the files just fine from within Windows, but IIS itself can't seem to do it. What do I need to do to fix this? Thanks!


  • Virtualbox Networking: XP Guest, Ubuntu Host: Connecting to Windows servers & local network?

    - by user51833
    Here's what I have: Windows XP running in VirtualBox 3.0.8_OSE r53138; host OS = Ubuntu 9.10 "Karmic Koala"; a Windows network in my office with SMB file servers; the guest OS connected to the internet and sharing folders with the host OS; and limited networking expertise.

    Here's what I actually need to do: use MS Outlook in my XP guest with all its calendar-sharing features and stuff (if this is all done through the internet, then great), or find a Linux app that can do the same things; and map Windows network servers, e.g. smb://server01/, in my XP guest (I can already access these in Ubuntu).

    Here's what I've tried with no luck: entering the server address (example above) in my XP guest's Windows Explorer address bar (got a "could not access the file, path or drive" error message; maybe I could enter login/password information, but I don't know how); mapping the server as a network drive (Windows could not find the path); and mounting the server as one of my shared folders (I couldn't find it through the shared folders browser in VirtualBox; is there somewhere in the Linux filesystem that Ubuntu keeps links to mounted servers?).
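
    On the credentials point: in the XP guest, a share can be mapped with an explicit login from a command prompt; a minimal sketch (the drive letter, server, share and domain names are placeholders):

      rem map the office file server with explicit credentials;
      rem the trailing * prompts for the password instead of echoing it
      net use Z: \\server01\share /user:OFFICEDOMAIN\yourlogin * /persistent:yes

    Note that inside the guest, Windows expects UNC syntax (\\server01\share); the smb:// form is a Linux/GNOME convention and won't resolve in XP. NetBIOS name resolution and network browsing also tend to fail behind VirtualBox's NAT, so if mapping by name fails, try mapping by IP address or switch the guest's adapter to bridged mode so it sits on the office LAN directly.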


  • Using the same Windows 8 Upgrade installer on multiple PCs

    - by Karan
    As per this article:

      You may transfer the software to another computer that belongs to you. … You may not transfer the software to share licenses between computers.

    But what if I have a bunch of PCs with a mix of XP, Vista and Windows 7? Can I purchase either the Windows 8 Pro Upgrade $40 (download only) or $70 (DVD) version (both of which come without a key) only once and use it to upgrade all the PCs? Since I'm not sharing the license and each PC has its own valid genuine license, it should be allowed, right? Or is it illegal? Even if they want people to shell out $40/$70 for each PC, how would they enforce the use of the installer/media on only one PC each?

    EDIT: I have been given to believe by a source that the installer will only check for the previous OS's key, which is what is confusing me (I have never purchased an upgrade version before this, only full retail or pre-installed versions). Is this true, or will I need to enter two keys to make the upgrade work: one for the previous version and then one for Windows 8? If the latter is the case, then the issue is solved, since obviously the same Windows 8 key will not be valid for multiple PCs.


  • Why such a dramatic difference in wireless router max. simultaneous connections?

    - by Jez
    Recently, I've needed to look into buying a wireless router for a mission-critical system at work that will need to support quite a few simultaneous connections (potentially a few hundred laptops). One thing I've noticed is that there seems to be a dramatic difference between the max. simultaneous connections different routers can support; see this page for example - anything from 32 to 35,000! Why is there this degree of difference? You'd have thought that if we know how to make routers that can handle thousands of connections, we wouldn't be making stuff that's limited to a pathetic 32 anymore. Is it a firmware thing? A hardware thing? Are low-end manufacturers purposely putting low arbitrary connection limits in so people can be "encouraged" to pay more for high-end routers?


  • Does using a hexacore CPU make sense?

    - by Exa
    I'm currently planning to upgrade my computer system, and I want to replace the CPU, board and RAM. I've already had a look at some hexacore CPUs from AMD and would like to know whether it makes any sense to use such a CPU with six cores. Is there any software that really uses six cores, especially in gaming?

    I'm using this PC mostly for gaming and from time to time for development. I know that on the dual-core system (2 x 3 GHz) I currently use, Visual Studio creates two instances of the compiler, one for each core. Would there be six instances of the compiler on a hexacore system, for super-fast compiling? Would running two applications cause the usage of more cores? (For example, two cores for a game you're playing while two other cores are used for compiling at the same time.)

    I hope someone can point out the benefits of a hexacore system. The OS would be Windows 7 64-bit, and I use the PC for gaming most of the time (Crysis 2, CoD, stuff like that).
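
    On the compiling question: Visual Studio's parallelism does scale with core count, in two layers; a sketch (the solution and file names are placeholders):

      rem MSBuild can build up to N projects of a solution concurrently
      msbuild MySolution.sln /m:6

      rem and the C++ compiler can translate N source files in parallel
      rem (Project Properties > C/C++ > General > Multi-processor Compilation)
      cl /MP6 /c file1.cpp file2.cpp file3.cpp

    So builds speed up roughly in proportion to how many independent projects or translation units there are. Games of that era, by contrast, rarely used more than two to four threads, so the gaming benefit of six cores is smaller.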


  • How to keep subtree removal (`rm -rf`) from starving other processes for Disk I/O?

    - by David Eyk
    We have a very large (multi-GB) Nginx cache directory for a busy site, which we occasionally need to clear all at once. I've solved this in the past by moving the cache folder to a new path, making a new cache folder at the old path, and then rm -rfing the old cache folder.

    Lately, however, when I need to clear the cache on a busy morning, the I/O from rm -rf is starving my server processes of disk access, as both Nginx and the server it fronts for are read-intensive. I can watch the load average climb while the CPUs sit idle and rm -rf takes 98-99% of disk I/O in iotop. I've tried ionice -c 3 when invoking rm, but it seems to have no appreciable effect on the observed behavior.

    Is there any way to tame rm -rf to share the disk more? Do I need to use a different technique that will take its cues from ionice?

    Update: The filesystem in question is an AWS EC2 instance store (the primary disk is EBS). The /etc/fstab entry looks like this:

      /dev/xvdb  /mnt  auto  defaults,nobootwait,comment=cloudconfig  0  2
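
    One detail that would explain ionice appearing to do nothing: the idle class (-c3) is only honoured by the CFQ I/O scheduler, and instance-store volumes often run noop or deadline by default. A sketch to check and switch (the device name matches the fstab entry above; the cache path is a placeholder):

      # the bracketed entry is the active scheduler
      cat /sys/block/xvdb/queue/scheduler
      # e.g.: noop [deadline] cfq

      # switch to CFQ so I/O priorities are honoured, then retry
      echo cfq > /sys/block/xvdb/queue/scheduler
      ionice -c3 rm -rf /mnt/old-cache

    If changing the scheduler isn't an option, a cruder throttle is to delete in small batches with pauses between them, trading total wall-clock time for steadier latency.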


  • Enabling SFTP Access within PLESK

    - by spelley
    I have a client who wants to ensure his uploads are secure, so we are trying to enable SFTP for him on our Linux Plesk server. I have enabled SSH access to /bin/bash for FTP accounts and created a new user. When I attempt to SFTP using either the IP address or the domain name, this is the error FileZilla gives me:

      Error: Authentication failed.
      Error: Critical error
      Error: Could not connect to server

    Here is some basic information about the server:

      Operating system: Linux 2.6.24.5-20080421a
      Plesk control panel version: psa v8.6.0_build86080930.03 os_CentOS 5

    I had read in some places that I should restart the SSH service in Server > Services; however, there is no SSH service in the list. I'm not really a server guy, so it's quite possible I'm missing something obvious. Thanks for any help you guys can provide!
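
    Two quick server-side checks that catch most SFTP authentication failures (a sketch; the username is a placeholder and the sftp-server path varies by distribution):

      # does the account really have a usable shell, not /bin/false?
      grep '^theclient:' /etc/passwd

      # is the sftp subsystem enabled in sshd's config?
      grep -i '^Subsystem' /etc/ssh/sshd_config
      # expected, on CentOS, something like:
      # Subsystem sftp /usr/libexec/openssh/sftp-server

      # apply config changes by restarting sshd from the system shell
      /etc/init.d/sshd restart

    sshd is usually managed by the OS init scripts rather than by Plesk, which would explain why it does not appear under Server > Services.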


  • Are there any viable DNS or LDAP alternatives for distributed key/value storage and retrieval?

    - by makerofthings7
    I'm working on a software app that needs distributed, decentralized name resolution and isn't bound to TCP/IP. More precisely, I need to store a "key" and look up its value, where the key may be a string, a number, or any other realistic data type. Examples:

    With a phone number, look up a name (or, with an area code, redirect to the server that handles that exchange).
    With an IP address, get a DNS name or a whois contact (string value).
    With a string, get an IP (like a DNS TXT or SRV record).

    I'm thinking out of the box here and looking for any software that allows for this (more info below). Are there any secure, scalable DNS alternatives that have gained notoriety? I could ask on Stack Overflow, but I think the infrastructure groups would have better insight on this.

    Edit, more info: I'm looking at Namecoin, the DNS version of Bitcoin, and since that project is faltering, I'm looking at alternative ways to store name-value pairs, with an optional qualifier. I think a name-value pair of global interest is useful, but on a limited scale. Namecoin tried to be too much and ended up becoming nothing. I'm trying to solve that problem by researching alternatives and applying distributed technologies where applicable. Bitcoin/Namecoin offers a distributed hash table, which has some positive aspects but is not useful for DNS, except for root servers.

