Search Results

Search found 87589 results on 3504 pages for 'veritas cluster server'.

  • Traffic estimation for a multiplayer flash game

    - by Steve Addington
    Hey, I want to know if my rough traffic estimate is right. It would be for a pretty simple realtime Flash game in the style of HaxBall (but not a soccer game); here's a video of it: http://www.youtube.com/watch?v=z_xBdFg1RcI
    So here comes my estimate - I don't know if it is realistic, and I hope someone can help me. Consider the packet attached as a typical one sent every 200 ms: 148 bytes of payload plus 64 bytes of header makes a packet of around 200 bytes. The server will receive 200 bytes x 6 players x 5 times a second = 6000 bytes/s = 5.85 KB/s = 46.9 kbit/s, plus it has to send all of that back to the players, so at this point we are at about 94 kbit/s. Once the server has received all the information, it performs the definitive calculation and sends the new positions to all players in a bigger packet of around 900 bytes, which has to be delivered to the other 6: 900 bytes x 6 players x 5 times a second = 27000 bytes/s = 26 KB/s = 210 kbit/s. Overall that would be about 26 KB per second, which is something like 130 MB of traffic per hour for a 6-player room. Somehow I think the numbers are too high - that would be a lot of traffic for such a simple game. Did I calculate something wrong?
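
    Purely as a cross-check, here is a small script that redoes the arithmetic from the figures quoted above. The packet sizes, player count and update rate come from the question; everything else is just a sketch:

        #!/bin/bash
        # Rough bandwidth estimate for one 6-player room at 5 updates per second.
        players=6; rate=5; in_pkt=200; out_pkt=900
        in_bps=$(( in_pkt * players * rate ))     # client -> server, bytes per second
        out_bps=$(( out_pkt * players * rate ))   # server -> clients, bytes per second
        total=$(( in_bps + out_bps ))
        echo "inbound : ${in_bps} B/s (~$(( in_bps * 8 / 1000 )) kbit/s)"
        echo "outbound: ${out_bps} B/s (~$(( out_bps * 8 / 1000 )) kbit/s)"
        echo "total   : ${total} B/s, roughly $(( total * 3600 / 1000000 )) MB per hour"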

  • SQL – Download NuoDB and Qualify for FREE Amazon Gift Cards

    - by Pinal Dave
    July has been a fantastic month, and Team NuoDB has really appreciated the active participation of the SQLAuthority.com reader base. Earlier we launched two contests with NuoDB, and both of them were very much appreciated by readers. There have been constant demands for more contests, and Team NuoDB is excited to support one more. Here are the details of the contests run earlier:
    What ACID stands in the Database? – Contest to Win 24 Amazon Gift Cards and Joes 2 Pros 2012 Kit
    What is the latest Version of NuoDB? – A Quick Contest to Get Amazon Gift Cards
    Based on the earlier successful contests, the kind folks at NuoDB decided to support one more round of giveaways for SQLAuthority.com contests. However, please note that this month's contest will end in the next 48 hours: you have to take part before July 31st, 2013 11:59:00 PM PST. Here is the quick contest: you just have to go and download NuoDB. The first 10 people who download NuoDB will get USD 10 Amazon gift cards. Everyone else will be entered into a lucky draw for an Amazon gift card of USD 50. Winners will be announced in the next 24 hours. To be eligible for this contest, please download NuoDB before July 31st, 2013 11:59:00 PM PST.
    Bonus Round: If you have entered the contest above, you can also enter to win the latest Beginning SSRS Joes 2 Pros book. You just have to leave a comment here noting how many different platforms NuoDB supports. Here are a few of the blog posts I wrote earlier on that subject:
    Part 1 – Install NuoDB in 90 Seconds
    Part 2 – Manage NuoDB Installation
    Part 3 – Explore NuoDB Database
    Part 4 – Migrate from SQL Server to NuoDB
    Part 5 – NuoDB and Third Party Explorer – SQuirreL SQL Client, SQL Workbench/J and DbVisualizer
    Reference: Pinal Dave (http://blog.sqlauthority.com)
    Filed under: PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

  • Ubuntu and racadm

    - by lmqcn
    I recently purchased a used PowerEdge 1850 server and it came with a DRAC card. After wiping the HDD and installing Ubuntu Server 12.04.3 LTS amd64 on it, I am now trying to gain access to the DRAC, which I believe is version 4. I have properly configured the DRAC to use its own IP on my LAN, and when I point my browser to that IP address I am greeted with the DRAC login page (it has the Dell logo and everything). However, the default credentials root/calvin were denied, so I think the previous owners set their own password. After doing some reading, it appears that I can reset the credentials to the default using:
        racadm config -g cfgUserAdmin -o cfgUserAdminPassword -i 1 newpassword
    but upon entering the command, I get this error:
        bash: /usr/sbin/racadm: No such file or directory
    This holds true even if I run sudo su prior to running the racadm command. If, however, I run:
        sudo racadm config -g cfgUserAdmin -o cfgUserAdminPassword -i 1 newpassword
    there are no errors. Yet when I try to log into the DRAC via the web interface using the credentials root/newpassword, I am still not granted access.
    I installed the Dell utilities via the guide at https://wiki.ubuntu.com/HardwareSupportMachinesServersDellNotes. I first tried to install the 64-bit version from the Dell repositories, but after that was unsuccessful I just followed the guide verbatim. No errors were produced in either case. I even followed the information at the bottom of the guide by executing:
        sudo pppd /dev/ttyS1 1382400 crtscts noipdefault noauth lock persist connect 'chat -v "" CLIENT CLIENTSERVER "\\c"'
    (obviously replacing /dev/ttyS1 with the correct information for my system).
        ls -l /usr/sbin/ | grep racadm
    yields:
        -rwxr-xr-x 1 root root 87930 Sep 16 04:03 racadm
    I have tried these credentials after each attempt at changing the password: root/calvin, root/newpassword, admin/calvin, admin/newpassword. All have been unsuccessful. What is the next course of action I should take?
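
    Not part of the original question - just a hedged diagnostic sketch. On an amd64 install, bash reporting "No such file or directory" for a binary that plainly exists usually means a 32-bit executable whose loader or libraries are missing, and it is worth confirming which racadm sudo actually runs before trusting the password change. The racadm subcommands below assume the locally installed Dell tools can reach the DRAC:

        file /usr/sbin/racadm      # a 32-bit ELF on amd64 needs 32-bit support libraries
        ldd /usr/sbin/racadm       # "not a dynamic executable" also hints at a missing loader
        sudo which racadm          # confirm which racadm root's PATH resolves to
        sudo racadm getsysinfo     # sanity-check that racadm can talk to the DRAC at all
        sudo racadm getconfig -g cfgUserAdmin -i 1   # inspect the user slot before/after changes

    If file reports a 32-bit binary, installing the 32-bit support libraries (or a 64-bit racadm build) is the likely next step.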

  • MySQL – Grouping by Multiple Columns to Single Column as A String

    - by Pinal Dave
    In the post titled SQL SERVER – Grouping by Multiple Columns to Single Column as A String we saw how to group data from multiple rows into comma-separated values on a single row, grouped by another column, using the FOR XML clause. In this post we will see how to produce the same result using the GROUP_CONCAT function in MySQL. Let us create the following table and data:
        CREATE TABLE TestTable (ID INT, Col VARCHAR(4));
        INSERT INTO TestTable (ID, Col)
        SELECT 1, 'A' UNION ALL SELECT 1, 'B' UNION ALL SELECT 1, 'C' UNION ALL
        SELECT 2, 'A' UNION ALL SELECT 2, 'B' UNION ALL SELECT 2, 'C' UNION ALL
        SELECT 2, 'D' UNION ALL SELECT 2, 'E';
    Now, to generate CSV values of the column Col for each ID, use the following code:
        SELECT ID, GROUP_CONCAT(col) AS CSV
        FROM TestTable
        GROUP BY ID;
    The result is:
        ID  CSV
        1   A,B,C
        2   A,B,C,D,E
    You can also change the delimiter. For example, if you want a pipe symbol (|) instead of a comma, use the following:
        SELECT ID, REPLACE(GROUP_CONCAT(col), ',', '|') AS CSV
        FROM TestTable
        GROUP BY ID;
    The result is:
        ID  CSV
        1   A|B|C
        2   A|B|C|D|E
    MySQL makes this very simple with its support for the GROUP_CONCAT function.
    Reference: Pinal Dave (http://blog.sqlauthority.com)
    Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
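
    As a side note (not from the original post): GROUP_CONCAT also accepts a SEPARATOR clause, so the pipe-delimited output can be produced without the extra REPLACE. A minimal sketch, assuming the table above lives in a scratch database named test:

        mysql -u root -p test -e "
          SELECT ID, GROUP_CONCAT(col ORDER BY col SEPARATOR '|') AS CSV
          FROM TestTable
          GROUP BY ID;"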

  • ecryptfs - decrypt and mount at boot with USB key

    - by Josh McGee
    I have a system running Ubuntu Server as a testbed for some services that I want to get familiar with. I decided to let the installation procedure set up encryption. I knew all along that I would have to decrypt it with the passphrase in order to get the system booted, but I assumed it wouldn't matter since it will only boot once or twice a month. However, my brother has informed me that he is a victim of power outages at the residence where this server is located. This means we have to explain to his girlfriend how to turn on the computer, attach a keyboard, connect a monitor (she just can't understand that she can type to the computer without a display, so whatever) and input the passphrase for us, while we are at work. I have arrived at the conclusion that I should just put together a USB key that can be plugged in before powering on the computer, to avoid all the trouble. Is this possible with ecryptfs? Is there a tutorial or simple list of instructions available so that I can knock this out and focus back on the stuff I care about? EDIT: I am aware that this is possible with LUKS and dm-crypt, but unfortunately the magical encryption that Ubuntu hands you during the installation is only ecryptfs so my question is specific to that.
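
    For what it's worth, a very rough sketch of the shape such a setup could take. It assumes the encrypted data can be mounted with the ecryptfs mount helper, that the key material sits on a USB stick labelled KEYS, and that the paths are placeholders; the home-directory wrapper created by the installer may need the ecryptfs-utils helpers (ecryptfs-mount-private and friends) instead:

        #!/bin/bash
        # Run early in boot, before any service that needs the encrypted data.
        mkdir -p /mnt/usbkey
        mount LABEL=KEYS /mnt/usbkey            # USB stick holding the passphrase file
        # ecryptfs.conf must contain a line: passphrase_passwd=<your mount passphrase>
        opts="key=passphrase:passphrase_passwd_file=/mnt/usbkey/ecryptfs.conf"
        opts="$opts,ecryptfs_cipher=aes,ecryptfs_key_bytes=16,ecryptfs_passthrough=n,ecryptfs_enable_filename_crypto=n"
        mount -t ecryptfs /secret /secret -o "$opts"
        umount /mnt/usbkey                      # remove the key material from the running system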

  • Use Those Schemas, People!

    - by BuckWoody
    Database schemas are just containers – they aren't users or anything else – think of a sub-directory on the hard drive. In early versions of SQL Server we "hid" schemas, placing all objects under "dbo", which gave the erroneous perception that schemas are users. In SQL Server 2005, we "un-hid" or re-introduced schemas within the database. Users can have a default schema (a place where their new objects go), you can add new schemas and transfer objects between them, and they have many other benefits. But I still see a lot of applications, developed by shops I know as well as vendors, that don't make use of a schema. Everything is piled under dbo. I completely understand this – since permissions can be granted to a schema, they feel a lot like a user, so it's just easier not to worry about both users and schemas when you create a database. But if you'll use them properly you can make your application more understandable and portable. You should at least take a few minutes and read more about them – you owe it to your users: http://msdn.microsoft.com/en-us/library/ms190387.aspx
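
    A minimal illustration of the housekeeping being advocated here - creating a schema, moving an object into it, and giving a user a default schema. The database, schema, table and user names are invented, and the statements are shown through the sqlcmd client purely as a sketch:

        # CREATE SCHEMA has to be the first statement in its batch, so run it on its own
        sqlcmd -S localhost -d SalesDb -Q "CREATE SCHEMA Sales AUTHORIZATION dbo;"
        # Move an existing table out of dbo into the new schema
        sqlcmd -S localhost -d SalesDb -Q "ALTER SCHEMA Sales TRANSFER dbo.Orders;"
        # New objects created by this user now land in Sales instead of dbo
        sqlcmd -S localhost -d SalesDb -Q "ALTER USER AppUser WITH DEFAULT_SCHEMA = Sales;"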

  • Controlling server configurations with IPS

    - by barts
    I recently received a customer question about how best to control which packages, and which versions, are used on their production Solaris 11 servers. They had considered pointing each server at its own software repository – a common initial approach. A simpler method leverages one of the dependency mechanisms we introduced with Solaris 11, but it is not immediately obvious to most people.
    Typically, most internal IT departments qualify particular versions for production use. What this customer wanted was to ensure that their operations staff only installed internally qualified versions of Solaris on their servers. The easiest way of doing this is to leverage the 'incorporate' type of dependency in a small package defined for each server type. From the reference "Packaging and Delivering Software With the Image Packaging System in Oracle Solaris 11.1":
        The incorporate dependency specifies that if the given package is installed, it must be at the given version, to the given version accuracy. For example, if the dependent FMRI has a version of 1.4.3, then no version less than 1.4.3 or greater than or equal to 1.4.4 satisfies the dependency. Version 1.4.3.7 does satisfy this example dependency. The common way to use incorporate dependencies is to put many of them in the same package to define a surface in the package version space that is compatible. Packages that contain such sets of incorporate dependencies are often called incorporations. Incorporations are typically used to define sets of software packages that are built together and are not separately versioned. The incorporate dependency is heavily used in Oracle Solaris to ensure that compatible versions of software are installed together. An example incorporate dependency is:
        depend type=incorporate fmri=pkg:/driver/network/ethernet/[email protected],5.11-0.175.0.0.0.2.1
    So, to make sure only qualified versions are installed on a server, create a package that will be installed on the machines to be controlled. This package will contain an incorporate dependency on the "entire" package, which controls the various components used to build Solaris. Every time a new version of Solaris has been qualified for production use, create a new version of this package specifying the new version of "entire" that was qualified. Once this new control package is available in the repositories configured on the production server, the pkg update command will update that system to the specified version. Unless a new version of the control package is made available, pkg update will report that no updates are available, since no version of the control package can be installed that satisfies the incorporate constraint.
    Note that, if desired, the same package can be used to specify which packages must be present on the system by adding either "require" or "group" dependencies; the latter permits removal of some of the packages, the former does not. More details on this can be found in either the section 5 pkg man page or the previously mentioned reference document.
    This technique of using package dependencies to constrain system configuration leverages the SAT solver which is at the heart of IPS, and is basic to how we package Solaris itself.
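
    Purely as an illustration, here is roughly what such a control package could look like and how it might be published and applied. The package name, version strings and repository path are all invented, and the exact pkgsend invocation may differ on your release:

        # qualified-build.p5m - a tiny site "control" package pinning the Solaris build
        cat > qualified-build.p5m <<'EOF'
        set name=pkg.fmri value=pkg://site/site/qualified-build@1.0
        set name=pkg.summary value="Pins production servers to the qualified Solaris build"
        depend type=incorporate fmri=pkg:/entire@0.5.11,5.11-0.175.1.0.0.24.2
        EOF
        pkgsend publish -s /export/site-repo qualified-build.p5m   # publish to the internal repo
        pkg install site/qualified-build                           # install on each production server
        # From now on, 'pkg update' can only move to a Solaris build allowed by the
        # incorporate constraint carried in the newest available version of this package.
        pkg update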

  • SQL – What is the latest Version of NuoDB? – A Quick Contest to Get Amazon Gift Cards

    - by Pinal Dave
    We had a great contest earlier last week – What ACID stands in the Database? – Contest to Win 24 Amazon Gift Cards and Joes 2 Pros 2012 Kit – and it received quite a few responses. Just like any other contest, not everyone was a winner. The kind folks at NuoDB decided to give another chance to everyone who did not win in the last contest. This means that if you missed taking part in the earlier contest, or if you took part and did not win, you still have one more chance to win an Amazon Gift Card.
    Here is the quick contest: you just have to go and download NuoDB. The first 10 people who download NuoDB will each get a USD 10 card. Everyone else will be entered into a lucky draw for an Amazon Gift Card of USD 50. Winners will be announced in the next 24 hours.
    Bonus Round: If you have entered the contest above, you can also enter to win the latest Beginning SSRS Joes 2 Pros book. You just have to leave a comment here about your experience with NuoDB and what the latest version of the product is. Here are a few of the blog posts I wrote earlier on that subject:
    Part 1 – Install NuoDB in 90 Seconds
    Part 2 – Manage NuoDB Installation
    Part 3 – Explore NuoDB Database
    Part 4 – Migrate from SQL Server to NuoDB
    Part 5 – NuoDB and Third Party Explorer – SQuirreL SQL Client, SQL Workbench/J and DbVisualizer
    Reference: Pinal Dave (http://blog.sqlauthority.com)
    Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

  • sp_ssiscatalog v1.0.1.0 now available for download

    - by jamiet
    13 days ago I wrote a blog post entitled Introducing sp_ssiscatalog (v1.0.0.0) in which I first made mention of sp_ssiscatalog, an open source stored procedure intended to make it easy to query the SSIS Catalog. I have been working on some enhancements since then and hence v1.0.1.0 is now available for download from Codeplex.
    What's new in this release
    This release includes the following enhancements:
    - [execution_id] now gets returned in a call to:
        EXEC [dbo].[sp_ssiscatalog] @operation_type='exec';
    - Filter events by specifying packages to ignore:
        EXEC [dbo].[sp_ssiscatalog] @operation_type='exec',@exec_events_packagesexcluded='SomePackage.dtsx,AnotherPackage.dtsx';
    - [event_message_id] is now returned in a list of events
    - The list of executions can now be filtered via a minimum and maximum execution_id:
        EXEC [dbo].[sp_ssiscatalog] @operation_type='execs',@execs_minimum_execution_id=198,@execs_maximum_execution_id=201
    - Events resultsets now have a field, [event_message_context_xml], that contains an XML document containing all [event_message_context] info (if any exists)
    Installation instructions
    1. Download the zip file at DB v1.0.1.0. It contains two files, SsisReportingPack.dacpac & SSISDB.dacpac
    2. Unzip to a folder of your choosing
    3. Open a command prompt and change to the directory into which you unzipped the files
    4. Execute:
        "%PROGRAMFILES(x86)%\Microsoft SQL Server\110\DAC\bin\sqlpackage.exe" /a:Publish /tdn:SsisReportingPack /sf:SSISReportingPack.dacpac /v:SSISDB=SSISDB /tsn:(local)
       (/tsn specifies the target server. Change as appropriate.)
    If everything works OK you'll see output confirming the publish, which differs slightly depending on whether the target database already exists or not. This will create a database called [SsisReportingPack] which contains [dbo].[sp_ssiscatalog].
    Feedback is welcomed!
    @Jamiet

  • Static IP configuration causing apt-get errors

    - by JPbuntu
    I am getting errors when running apt-get update or when installing new packages, but only when the server is configured with a static IP address. Changing the configuration back to DHCP and restarting networking fixes the problem, although I want a static IP. Once it is working I can change back to my static IP address and restart networking, but that only works until I restart the server (restarting the router is fine); then I start getting the same errors and have to switch back to DHCP. Any ideas on what could be causing this, or tips on troubleshooting it? Thanks in advance.
    Here is my static IP configuration:
        auto eth0
        iface eth0 inet static
            address 192.168.2.2
            netmask 255.255.255.0
            gateway 192.168.2.1
    The apt-get update errors go something like this. A few of these:
        Ign http://us.archive.ubuntu.com precise-backports InRelease
    then a lot of these:
        Err http://security.ubuntu.com precise-security Release.gpg
          Something wicked happened resolving 'security.ubuntu.com:http' (-5 - No address associated with hostname)
    and a lot of these:
        W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/universe/i18n/Translation-en  Something wicked happened resolving 'us.archive.ubuntu.com:http' (-5 - No address associated with hostname)
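
    The "No address associated with hostname" errors point at name resolution rather than routing, and on 12.04 a static stanza that never supplies DNS servers is a common culprit (DHCP hands out nameservers, the static config above does not). A hedged sketch of what the stanza could look like with resolvconf-style DNS options added - the nameserver addresses and search domain are placeholders:

        # /etc/network/interfaces - static stanza including DNS information
        auto eth0
        iface eth0 inet static
            address 192.168.2.2
            netmask 255.255.255.0
            gateway 192.168.2.1
            dns-nameservers 192.168.2.1 8.8.8.8
            dns-search example.lan
        # afterwards: sudo ifdown eth0 && sudo ifup eth0, then check /etc/resolv.conf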

  • X server with nvidia driver crashing: 12.04

    - by Raster
    My X server consistently crashes, seemingly while it is idle. This behaviour is new with 12.04 and only happens on the second display of a multiseat system. Is there a configuration change I can make to stop this?
        X.Org X Server 1.11.3
        Release Date: 2011-12-16
        X Protocol Version 11, Revision 0
        Build Operating System: Linux 2.6.42-26-generic x86_64 Ubuntu
        Current Operating System: Linux Desktop 3.2.0-29-generic #46-Ubuntu SMP Fri Jul 27 17:03:23 UTC 2012 x86_64
        Kernel command line: BOOT_IMAGE=/vmlinuz-3.2.0-29-generic root=/dev/mapper/Group1-Root ro ramdisk_size=512000 quiet splash vt.handoff=7
        Build Date: 04 August 2012 01:51:23AM
        xorg-server 2:1.11.4-0ubuntu10.7 (For technical support please see http://www.ubuntu.com/support)
        Current version of pixman: 0.24.4
        Before reporting problems, check http://wiki.x.org to make sure that you have the latest version.
        Markers: (--) probed, (**) from config file, (==) default setting, (++) from command line, (!!) notice, (II) informational, (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
        (==) Log file: "/var/log/Xorg.1.log", Time: Sun Sep 2 22:37:06 2012
        (==) Using config file: "/etc/X11/xorg.conf"
        (==) Using system config directory "/usr/share/X11/xorg.conf.d"
        Backtrace:
        0: /usr/bin/X (xorg_backtrace+0x26) [0x7fcee86be846]
        1: /usr/bin/X (0x7fcee8536000+0x18c6ea) [0x7fcee86c26ea]
        2: /lib/x86_64-linux-gnu/libpthread.so.0 (0x7fcee785c000+0xfcb0) [0x7fcee786bcb0]
        3: /usr/lib/x86_64-linux-gnu/xorg/extra-modules/nvidia_drv.so (0x7fcee14ba000+0x902a9) [0x7fcee154a2a9]
        4: /usr/lib/x86_64-linux-gnu/xorg/extra-modules/nvidia_drv.so (0x7fcee14ba000+0xfd5e7) [0x7fcee15b75e7]
        5: /usr/lib/x86_64-linux-gnu/xorg/extra-modules/nvidia_drv.so (0x7fcee14ba000+0x4d6b92) [0x7fcee1990b92]
        6: /usr/lib/x86_64-linux-gnu/xorg/extra-modules/nvidia_drv.so (0x7fcee14ba000+0x4d74d5) [0x7fcee19914d5]
        7: /usr/lib/x86_64-linux-gnu/xorg/extra-modules/nvidia_drv.so (0x7fcee14ba000+0x4d767d) [0x7fcee199167d]
        8: /usr/bin/X (0x7fcee8536000+0x1196ec) [0x7fcee864f6ec]
        9: /usr/bin/X (0x7fcee8536000+0xe8ad5) [0x7fcee861ead5]
        10: /usr/bin/X (0x7fcee8536000+0xe9d45) [0x7fcee861fd45]
    /etc/X11/xorg.conf:
        Section "ServerLayout"  Identifier "Desktop"  Screen 0 "DesktopScreen" 0 0  InputDevice "DesktopMouse" "CorePointer"  InputDevice "DesktopKeyboard" "CoreKeyboard"  Option "AutoAddDevices" "false"  Option "AllowEmptyInput" "true"  Option "AutoEnableDevices" "false"  EndSection
        Section "ServerLayout"  Identifier "Desktop2"  Screen 1 "Desktop2Screen" 0 0  InputDevice "Desktop2Mouse" "CorePointer"  InputDevice "Desktop2Keyboard" "CoreKeyboard"  Option "AutoAddDevices" "false"  Option "AllowEmptyInput" "true"  Option "AutoEnableDevices" "false"  EndSection
        Section "Module"  Load "dbe"  Load "extmod"  Load "type1"  Load "freetype"  Load "glx"  EndSection
        Section "Files"  EndSection
        Section "ServerFlags"  Option "AutoAddDevices" "false"  Option "AutoEnableDevices" "false"  Option "AllowMouseOpenFail" "on"  Option "AllowEmptyInput" "on"  Option "ZapWarning" "on"  Option "HandleSepcialKeys" "off"  # Zapping on
            Option "DRI2" "on"  Option "Xinerama" "0"  EndSection
        # Desktop Mouse
        Section "InputDevice"  Identifier "DesktopMouse"  Driver "evdev"  Option "Device" "/dev/input/event3"  Option "Protocol" "auto"  Option "GrabDevice" "on"  Option "Emulate3Buttons" "no"  Option "Buttons" "5"  Option "ZAxisMapping" "4 5"  Option "SendCoreEvents" "true"  EndSection
        # Desktop2 Mouse
        Section "InputDevice"  Identifier "Desktop2Mouse"  Driver "evdev"  Option "Device" "/dev/input/event5"  Option "Protocol" "auto"  Option "GrabDevice" "on"  Option "Emulate3Buttons" "no"  Option "Buttons" "5"  Option "ZAxisMapping" "4 5"  Option "SendCoreEvents" "true"  EndSection
        Section "InputDevice"  Identifier "DesktopKeyboard"  Driver "evdev"  Option "Device" "/dev/input/event4"  Option "XkbRules" "xorg"  Option "XkbModel" "105"  Option "XkbLayout" "us"  Option "Protocol" "Standard"  Option "GrabDevice" "on"  EndSection
        Section "InputDevice"  Identifier "Desktop2Keyboard"  Driver "evdev"  Option "Device" "/dev/input/event6"  Option "XkbRules" "xorg"  Option "XkbModel" "105"  Option "XkbLayout" "us"  Option "Protocol" "Standard"  Option "GrabDevice" "on"  EndSection
        Section "Monitor"  Identifier "Desktop2Monitor"  VendorName "Acer"  ModelName "Acer G235H"  HorizSync 30.0 - 83.0  VertRefresh 56.0 - 75.0  Option "DPMS"  EndSection
        Section "Monitor"  Identifier "DesktopMonitor"  VendorName "Acer"  ModelName "Acer H213H"  HorizSync 30.0 - 83.0  VertRefresh 56.0 - 75.0  Option "DPMS"  EndSection
        Section "Device"  Identifier "EVGACard"  Driver "nvidia"  VendorName "NVIDIA Corporation"  BoardName "GeForce GTX 560 Ti"  Option "Coolbits" "1"  BusID "PCI:2:0:0"  Screen 0  EndSection
        Section "Device"  Identifier "XFXCard"  Driver "nvidia"  VendorName "NVIDIA Corporation"  BoardName "GeForce GTX 9800"  Option "Coolbits" "1"  BusID "PCI:5:0:0"  Screen 0  EndSection
        Section "Screen"  Identifier "DesktopScreen"  Device "EVGACard"  Monitor "DesktopMonitor"  DefaultDepth 24  SubSection "Display"  Depth 24  EndSubSection  EndSection
        Section "Screen"  Identifier "Desktop2Screen"  Device "XFXCard"  Monitor "Desktop2Monitor"  DefaultDepth 24  SubSection "Display"  Depth 24  EndSubSection  EndSection

  • Is it necessary to add an HP printer to CUPS using the hplip URI?

    - by JPbuntu
    I recently set up a CUPS print server (Ubuntu Server 12.04) and I am having trouble with the performance of an HP Color LaserJet CP3505n. The printer pauses for about a second between printing each page, which is annoying when there is a lot of printing to be done. This doesn't happen when the printer is installed directly on a Windows client. In an attempt to fix this I have set up the printer a couple of different ways. I decided not to use a Samba share, since this wiki said IPP is preferred.
    First Method: I added the HP LaserJet to CUPS as a Discovered Network Printer and selected the "HP Color LaserJet cp3505 hpijs pcl3, 3.12.2 (en)" driver. I did not use an hplip URI.
    Second Method (hplip URI): I thought adding hplip to the mix might improve performance, so I added the printer like this:
    1. Ran hp-setup -m 192.168.2.60 and, when prompted to select a driver, chose "HP Color LaserJet cp3505 hpijs pcl3, 3.12.2 (en)"
    2. Used hplip to generate a URI: hp-makeuri 192.168.2.60
    3. Added the printer to CUPS as a Local Printer: HP Printer (HPLIP), and entered: hp:/net/HP_Color_LaserJet_CP3505?ip=192.168.2.60
    Either way, I am able to share the printer on the network by adding a printer as http://192.168.2.2:631/printers/HP_LASER-TERRAC.
    Does it make a difference which way the printer is added to CUPS? If so, and I install the printer with the hp URI, can I still change the driver using the CUPS web interface? I have been trying out different drivers to try to improve performance, and the CUPS interface is the easiest way to change them. Thanks in advance.
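
    Just a sketch of how the same experiments can be driven from the command line rather than the web UI; the queue name is made up, the URIs are the ones from the question, and the -m model string needs to be replaced with one reported by lpinfo:

        lpinfo -v | grep -i hp          # device URIs CUPS can see (including the hp:/net/... one)
        lpinfo -m | grep -i cp3505      # driver/model strings available for this printer
        # (Re)create the queue against the hplip URI with a chosen driver
        sudo lpadmin -p HP_CP3505 -E \
             -v "hp:/net/HP_Color_LaserJet_CP3505?ip=192.168.2.60" \
             -m "replace-with-a-model-string-from-lpinfo-m"
        # Trying a different driver later is just another lpadmin -p HP_CP3505 -m ... call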

  • Frequent disconnects using wlan AR9285

    - by John Neil
    I'm getting a large number of disconnects from my AR9285 wireless LAN device since I switched to Oneiric Server (I did not see these happen with Oneiric Desktop). Here is the syslog snippet:
        Oct 17 09:43:17 weather kernel: [ 1537.329138] wlan0: deauthenticated from 00:12:17:7a:8e:42 (Reason: 7)
        Oct 17 09:43:17 weather kernel: [ 1537.340409] cfg80211: All devices are disconnected, going to restore regulatory settings
        Oct 17 09:43:17 weather kernel: [ 1537.340423] cfg80211: Restoring regulatory settings
        Oct 17 09:43:17 weather kernel: [ 1537.340435] cfg80211: Calling CRDA to update world regulatory domain
        Oct 17 09:43:17 weather kernel: [ 1537.348571] cfg80211: Ignoring regulatory request Set by core since the driver uses its own custom regulatory domain
        Oct 17 09:43:17 weather kernel: [ 1537.348581] cfg80211: World regulatory domain updated:
        Oct 17 09:43:17 weather kernel: [ 1537.348586] cfg80211: (start_freq - end_freq @ bandwidth), (max_antenna_gain, max_eirp)
        Oct 17 09:43:17 weather kernel: [ 1537.348594] cfg80211: (2402000 KHz - 2472000 KHz @ 40000 KHz), (300 mBi, 2000 mBm)
        Oct 17 09:43:17 weather kernel: [ 1537.348600] cfg80211: (2457000 KHz - 2482000 KHz @ 20000 KHz), (300 mBi, 2000 mBm)
        Oct 17 09:43:17 weather kernel: [ 1537.348607] cfg80211: (2474000 KHz - 2494000 KHz @ 20000 KHz), (300 mBi, 2000 mBm)
        Oct 17 09:43:17 weather kernel: [ 1537.348613] cfg80211: (5170000 KHz - 5250000 KHz @ 40000 KHz), (300 mBi, 2000 mBm)
        Oct 17 09:43:17 weather kernel: [ 1537.348620] cfg80211: (5735000 KHz - 5835000 KHz @ 40000 KHz), (300 mBi, 2000 mBm)
    Here is the relevant lspci output:
        # lspci | grep Atheros
        02:00.0 Network controller: Atheros Communications Inc. AR9285 Wireless Network Adapter (PCI-Express) (rev 01)
    I have done quite a bit of searching and saw discussions for previous versions of Ubuntu that recommended installing the linux-backports-modules package. However, this does not appear to be available for Oneiric (just the headers are listed as a package). Any advice on how to achieve a stable wireless connection for this server? Its location mitigates against using a wired connection.

  • apt-get update stuck on "Waiting for Headers"

    - by crasic
    I'm setting up a Maverick server on a spare PC. The install completes fine and the system boots into the shell. However, when I try to do an apt-get update, apt hangs on almost every entry with the message:
        99% [Waiting for headers]
    Sometimes a rate of 96 B/s appears on the far right, and the actual percentage it claims also varies. Searching around online gave a potential solution: using the option Acquire::http::Pipeline-Depth="0". This somewhat alleviates the problem, i.e. it stalls on every other entry with the same message as above. If you wait it out (the whole update took about 4 hours), the update still fails, as a good portion of the hits show an "unable to connect" or similar message, despite the fact that I can ping the servers from the PC just fine.
    The problem is also unrelated to the mirror used, since I've tried about a dozen mirrors with no success; I've even tried commenting out everything but the main entry in sources.list and it still refuses to update. The network connection is fine, since I can ping and wget (apt won't let me install lynx until I run a successful update) just fine. I've also reinstalled the distro with no luck. The only thing weird about the setup is that the PC is connecting to the internet through my Windows laptop with ICS configured properly, but as I've said, the network connection is fine.
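
    For reference, a hedged sketch of how the pipelining workaround can be applied per run or made permanent, plus a quick name-resolution check (the config file name is arbitrary); ICS acts as a DNS proxy, so resolution through the laptop is worth verifying too:

        sudo apt-get -o Acquire::http::Pipeline-Depth=0 update      # one-off run without pipelining
        echo 'Acquire::http::Pipeline-Depth "0";' | sudo tee /etc/apt/apt.conf.d/99-no-pipelining
        nslookup us.archive.ubuntu.com                              # does DNS via the ICS laptop work?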

  • How would I batch rename a lot of files using command-line?

    - by Whisperity
    I have a problem I am unable to solve: I need to rename a great dump of files using patterns. I tried using this, but I always get an error. I have a folder with a lot of files inside; running ls -1 | wc -l reports about 160000 of them. The problem is that I wish to move these files to a Windows system, but most of them have characters like : and ? in their names, which makes the files inaccessible on said Windows-based systems. (As a "do not solve but deal with" method, I tried booting a LiveCD on the Windows system and moving the files using the live OS. Under that Ubuntu, the files were readable and writable on the mounted NTFS partition, but when I booted back into Windows, it showed that the file was there but Windows was unable to access it in any fashion: rename, delete or open.)
    I tried running rename 's/\:/_' * inside the folder, but I got an "Argument list too long" error. Some searching revealed that this happens because I have so many files, and then I arrived here. The problem is that I don't know how to alter the command to suit my needs, as I always end up with various errors. For example, trying
        find -name '*:*' | xargs rename : _
    gives
        xargs: unmatched single quote; by default quotes are special to xargs unless you use the -0 option
        syntax error at (eval 1) line 1, near ":"
        xargs: rename: exited with status 255; aborting
    Adding -0 after xargs turns the error message into
        xargs: argument line too long
    These files are archive files generated by various PHP scripts. The best solution would be a chance to rename them before they are moved to Windows, but if there is no way to do that, perhaps they can be renamed while they are being moved to Windows. I use Samba and proftpd to move the files. Unfortunately, graphical software is out of the question, as the server containing the files is what it is - a server with only a command-line interface.
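
    A minimal sketch of the usual pattern for this situation: let find hand the names to rename over a NUL-separated stream, so neither the shell's argument-list limit nor the odd characters get in the way. This assumes the Perl-based rename that Ubuntu ships and replaces both : and ? with _ (adjust the expression as needed); test on a copy first:

        # Dry run: -n prints what would be renamed without touching anything
        find . -maxdepth 1 -type f -name '*[:?]*' -print0 | xargs -0 rename -n 's/[:?]/_/g'
        # Run it for real once the output looks right
        find . -maxdepth 1 -type f -name '*[:?]*' -print0 | xargs -0 rename 's/[:?]/_/g'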

  • FTP gives me an error when uploading and deleting files

    - by AR Games
    Here's the error I get when trying to delete files:
        Command:  DELE index.html
        Response: 550 Delete operation failed.
    Here's the error I get when trying to upload files:
        Command:  OPTS UTF8 ON
        Response: 200 Always in UTF8 mode.
        Status:   Connected
        Status:   Starting upload of C:\wamp\www\.DS_Store
        Command:  CWD /var/www/html
        Response: 250 Directory successfully changed.
        Command:  TYPE A
        Response: 200 Switching to ASCII mode.
        Command:  PASV
        Response: 227 Entering Passive Mode (76,185,76,101,78,222).
        Command:  STOR .DS_Store
        Response: 553 Could not create file.
        Error:    Critical file transfer error
        Status:   Retrieving directory listing...
        Command:  TYPE I
        Response: 200 Switching to Binary mode.
        Command:  PASV
        Response: 227 Entering Passive Mode (76,185,76,101,23,94).
        Command:  LIST
        Response: 150 Here comes the directory listing.
        Response: 226 Directory send OK.
        Status:   Directory listing successful
        Response: 421 Timeout.
        Error:    Connection closed by server
        Status:   Disconnected from server
    I'm running Windows and using the FileZilla FTP client.
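
    The 550/553 responses are the server refusing the operations, which with a vsftpd-style setup usually comes down to filesystem permissions or uploads being disabled rather than anything on the client. A hedged checklist to run on the server (the FTP account name is a placeholder):

        ls -ld /var/www/html                         # who owns the target directory?
        sudo chown -R ftpuser:ftpuser /var/www/html  # or grant the FTP account group write access
        grep -i write_enable /etc/vsftpd.conf        # vsftpd needs write_enable=YES for STOR/DELE
        sudo service vsftpd restart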

  • MySQL Server 5.6 defaults changes

    - by user12626240
    We're improving the MySQL Server defaults, as announced by Tomas Ulin at MySQL Connect. Here's what we're changing:
        Setting | Old | New | Notes
        back_log | 50 | 50 + (max_connections / 5), capped at 900 |
        binlog_checksum | off | CRC32 | New variable in 5.6
        binlog_row_event_max_size | 1k | 8k |
        flush_time | 1800 | Windows changes from 1800 to 0 | Was already 0 on other platforms
        host_cache_size | 128 | 128 + 1 for each of the first 500 max_connections + 1 for every 20 max_connections over 500, capped at 2000 | New variable in 5.6
        innodb_autoextend_increment | 8 | 64 | Now affects *.ibd files. 64 is 64 megabytes
        innodb_buffer_pool_instances | 0 | 8. On 32-bit Windows only, if innodb_buffer_pool_size is greater than 1300M, default is innodb_buffer_pool_size / 128M |
        innodb_concurrency_tickets | 500 | 5000 |
        innodb_file_per_table | off | on |
        innodb_log_file_size | 5M | 48M | InnoDB will always change size to match the my.cnf value. Also see innodb_log_compressed_pages and binlog_row_image
        innodb_old_blocks_time | 0 | 1000 (1 second) |
        innodb_open_files | 300 | 300; if innodb_file_per_table is ON, higher of table_open_cache or 300 |
        innodb_purge_batch_size | 20 | 300 |
        innodb_purge_threads | 0 | 1 |
        innodb_stats_on_metadata | on | off |
        join_buffer_size | 128k | 256k |
        max_allowed_packet | 1M | 4M |
        max_connect_errors | 10 | 100 |
        open_files_limit | 0 | 5000 | See note 1
        query_cache_size | 0 | 1M |
        query_cache_type | on/1 | off/0 |
        sort_buffer_size | 2M | 256k |
        sql_mode | none | NO_ENGINE_SUBSTITUTION | See later post about default my.cnf for STRICT_TRANS_TABLES
        sync_master_info | 0 | 10000 | Recommend: master_info_repository=table
        sync_relay_log | 0 | 10000 |
        sync_relay_log_info | 0 | 10000 | Recommend: relay_log_info_repository=table. Also see Replication Relay and Status Logs
        table_definition_cache | 400 | 400 + table_open_cache / 2, capped at 2000 |
        table_open_cache | 400 | 2000 | Also see table_open_cache_instances
        thread_cache_size | 0 | 8 + max_connections / 100, capped at 100 |
    Note 1: In 5.5 there was already a rule to make open_files_limit 10 + max_connections + table_cache_size * 2 if that was higher than the user-specified value. Now uses the higher of that and (5000 or what you specify).
    We are also adding a new default my.cnf file and guided instructions on the key settings to adjust. More on this in a later post. We're also providing a page with suggestions for settings to improve backwards compatibility. The old example files like my-huge.cnf are obsolete. Some of the improvements are present from 5.6.6 and the rest are coming. These are ideas, and until they are in an official GA release, they are subject to change.
    As part of this work I reviewed every old server setting plus many hundreds of emails of feedback and testing results from inside and outside Oracle's MySQL Support team and the many excellent blog entries and comments from others over the years, including from many MySQL Gurus out there, like Baron, Sheeri, Ronald, Schlomi, Giuseppe and Mark Callaghan.
    With these changes we're trying to make it easier to set up the server by adjusting only a few settings that will cause others to be set. This happens only at server startup and only applies to variables where you haven't set a value. You'll see a similar approach used for the Performance Schema. The Gurus don't need this but for many newcomers the defaults will be very useful.
    Possibly the most unusual change is the way we vary the setting for innodb_buffer_pool_instances for 32-bit Windows.
    This is because we've found that DLLs with specified load addresses often fragment the limited four gigabyte 32-bit address space and make it impossible to allocate more than about 1300 megabytes of contiguous address space for the InnoDB buffer pool. The smaller requests for many pools are more likely to succeed.
    If you change the value of innodb_log_file_size in my.cnf you will see a message like this in the error log file at the next restart, instead of the old error message:
        [Warning] InnoDB: Resizing redo log from 2*64 to 5*128 pages, LSN=5735153
    One of the biggest challenges for the defaults is the millions of installations on a huge range of systems, from point of sale terminals and routers through shared hosting or end user systems and on to major servers with lots of CPU cores, hundreds of gigabytes of RAM and terabytes of fast disk space. Our past defaults were for the smaller systems and these change that to larger shared hosting or shared end user systems, still with a bias towards the smaller end. There is a bias in favour of OLTP workloads, so reporting systems may need more changes. Where there is a conflict between the best settings for benchmarks and normal use, we've favoured production, not benchmarks.
    We're very interested in your feedback, comments and suggestions.
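
    A quick way to see which of these defaults a particular 5.6 build actually ships with is simply to ask the server; a minimal sketch (connection details are assumptions):

        mysql -u root -p -e "SHOW GLOBAL VARIABLES WHERE Variable_name IN
          ('back_log','innodb_file_per_table','innodb_log_file_size','innodb_buffer_pool_instances',
           'query_cache_type','sql_mode','table_open_cache');"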

  • How do I stop color changes when quitting vi from a terminal emulator?

    - by Michael Warhol
    I have a problem with colors when using vi under Ubuntu 12.04. I'm connecting to my Ubuntu server from a PC, using PowerTerm terminal emulation software. I have PowerTerm set up to display black text on a grey background. When I connect to the Ubuntu box, the screen is fine. When I open a file with vi, the screen is also fine: the text is black on a grey background, which is normal for my PowerTerm setup. However, if the file is less than a full screen long, the remainder of the screen has a black background. When I quit vi, the entire background turns black and the text becomes white, and I have to do a Terminal Reset to restore my normal text and background colors. What I want is for there to be no change at all when I use vi: the text should be black and the background grey. I have another server loaded with RedHat 9, and that acts normally; colors don't change when using vi. Here is my .vimrc file:
        set compatible
        syntax off
        let g:loaded_matchparen=1
        set nocp
        set noincsearch
        set nohlsearch
        set noshowmatch
        set bg=dark
    I've tried set bg=dark and set bg=light; it makes no difference. Is there some other set command that would clear this up for me, or some TERM setting (my TERM is set to linux)?

  • What's a good host for an active vBulletin site?

    - by Kyle
    I've been switching hosts, using a VPS each time, and I'm just not sure I'm finding the right VPSes. I've used a VPS from burst.net and from rubyringtech, and I feel like the slow speed is slowly killing my site. I really don't know if it's the network or the VPS itself, but I really want to fix this. When I top into the VPS at peak times it shows this:
        top - 03:18:56 up 16:33,  1 user,  load average: 1.33, 1.40, 1.33
        Tasks:  30 total,   1 running,  29 sleeping,   0 stopped,   0 zombie
        Cpu(s): 27.2%us, 13.6%sy,  0.0%ni, 59.2%id,  0.0%wa,  0.0%hi,  0.0%si,  0.0%st
        Mem:   1048576k total,   679712k used,   368864k free,        0k buffers
        Swap:        0k total,        0k used,        0k free,        0k cached
    Pages take at least a good 2-3 minutes to load, and I have only 50-60 members on the forum. I previously had a shared hosting account and the forum was lightning fast. Is a VPS a bad idea? What should I do to fix this? I'm running lighttpd with XCache and the latest MySQL and PHP versions. The server is an Intel i7 2600 with a 1 Gb uplink (I think the 1 Gb uplink claim is a lie, because I've tested the network and the highest download speed I've seen was 20 MB/s from a code.google page). All in all, I've seen people talking about Linode - should I try them? I honestly don't need a dedicated server yet; it's only 50-70 members online. What should I do? I really want a VPS because I enjoy root access. Does anyone have any suggestions?

  • Process that needs a volume starting before volume mounts

    - by user36126
    The destination for incoming CrashPlan backups on my server (11.04) is /media/SeagateBig (SeagateBig is the volume name of my 2TB USB drive). When the server boots, two things happen: 1) SeagateBig auto-mounts and 2) CrashPlan starts. The problem is that these two things often don't happen in that order. Then I get this sequence: CrashPlan starts, looks for /media/SeagateBig, doesn't find it, and instead of waiting for it, CREATES IT. Now it's backing up onto my / filesystem. NOT COOL. Meanwhile, when SeagateBig finally gets around to mounting, it finds that /media/SeagateBig already exists, shrugs, and creates /media/SeagateBig_ as its mount point instead.
    What I need is a way for the order to be enforced - where SeagateBig mounts, and then and only then the CrashPlan service is started. Unless I learn that CrashPlan can be told to wait for its destination directory and never create it, which I am also investigating. But the CrashPlanEngine script is installed by the product, so I am loath to modify it, even though I know I could, by having it loop until df greps successfully for "SeagateBig".
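
    One low-touch way to get the ordering without editing the CrashPlanEngine script itself is to stop the service from starting on its own and launch it from a small wrapper that waits for the mount. A rough sketch, assuming CrashPlan's stock init script is /etc/init.d/crashplan and the wrapper is run from rc.local or an upstart job:

        #!/bin/bash
        # /usr/local/sbin/start-crashplan-when-mounted
        until mountpoint -q /media/SeagateBig; do
            sleep 5                      # wait until the USB volume is actually mounted
        done
        service crashplan start          # only now let the backup engine see its destination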

  • What strategy should be employed to access Facebook data offline?

    - by user686021
    I'm working on a project similar to Klout, which provides detail about how you influence other people and who influenced you. We'll be fetching data from a few social networking sites (e.g. LinkedIn, Facebook, Twitter) to analyze how users interact with one another. For that we need to parse the data, store it in a database, and analyze it so that the strength of the relationship between two users can be determined. We'll be accessing the data offline as well, to provide users with accurate results. If we consider Facebook activities, we need access to Facebook users' news feed and wall data, which includes likes, comments, shares etc. To decide how one user influences another, we'll store all the data and analyze it.
    I need suggestions on what steps to take for good performance. We'll be using ASP.NET (C#) Web Forms, SQL Server and jQuery. The main concern is parsing the data and storing and retrieving it with the least overhead. To that end, I've summarized a few questions below:
    1. Should we switch to a document-oriented database, like MongoDB or RavenDB, for the whole app or part of it, even though none of the team members have experience with them?
    2. Should we use SQL Server Analysis Services?
    3. Is there any other library than Json.NET for parsing the data?
    4. Is it advisable to use any C# library over FQL + GET requests?
    I've tried to provide as much info as possible. Please share your views.

  • April Download Ranking (Fusion Middleware Edition)

    - by rika.tokumichi
    The monthly OTN Japan download ranking for Fusion Middleware products, covering April 1-30:
    1. Oracle WebLogic Server 11g [Download]
    2. Oracle JDeveloper 11g [Download]
    3. Oracle JRockit Mission Control 3.1.2 [Download]
    4. Oracle SOA Suite 11g [Download]
    5. Oracle JDeveloper 10g [Download]
    The post also highlights the Oracle Fusion Middleware 11g R1 (11.1.1.3) release (WebLogic, SOA, BPM, WebCenter, JDeveloper) and a set of OTN resources on Oracle Coherence, including overview material in Flash and PDF form, a three-part Coherence article series, and material on using Oracle Coherence and Oracle WebLogic Server from Eclipse.

  • February Download Ranking (Fusion Middleware Edition)

    - by rika.tokumichi
    The monthly OTN Japan download ranking for Fusion Middleware products, covering February 1-28:
    1. Oracle WebLogic Server 11g [Download]
    2. Oracle JDeveloper 11g [Download]
    3. Oracle JRockit Mission Control 3.1.2 [Download]
    4. Oracle SOA Suite 11g [Download]
    5. Oracle WebLogic Server 10g [Download]
    The post also highlights OTN resources related to the fourth-ranked Oracle SOA Suite 11g, including BPEL material, the SOA/BPM special page, the Architect Center SOA section, and content on Oracle SOA Suite 11g R1 Patch Set 2 (11.1.1.3.0).

  • problems establishing ssh connection

    - by Superbyte
    For two days I have been facing a really weird problem. I recently installed Ubuntu Server 14.04 LTS on a workstation. It has a fixed IP address, which I can successfully ping from other computers on the network. But when I try to establish an SSH connection from a Windows computer via PuTTY, I get some strange errors which I cannot fix. The problem is that PuTTY takes a really long time trying to establish a connection; after about 10 seconds I get the following error:
        Network error: Software caused connection abort
    When I click the Restart Session option several times after PuTTY shows the error message, I can eventually log in. But now comes the other problem: when the login prompt appears on the PuTTY console I type in the user, but it takes a really long time until I can type in the password to log in.
    This is what I have already tried:
    - sshd: ALL in /etc/hosts.allow
    - commented out the "session optional pam_motd.so" line in /etc/pam.d/login and /etc/pam.d/sshd
    - configured the firewall with: sudo iptables -A INPUT -p tcp --dport ssh -j ACCEPT
    - checked that the SSH server is listening on port 22
    - set UseDNS no in /etc/ssh/sshd_config
    I hope someone can help me, because this problem is really annoying. Thanks in advance.
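
    A hedged set of diagnostics that usually narrows this kind of delay down; nothing here is specific to this machine beyond the paths already mentioned:

        tail -f /var/log/auth.log                    # watch the server side during a connection attempt
        sudo sshd -T | grep -Ei 'usedns|gssapi'      # dump the settings sshd is actually running with
        ssh -vvv user@server-ip                      # from a Linux box, see exactly where the handshake stalls
        # Long pauses before the password prompt are often reverse DNS or GSSAPI negotiation;
        # testing GSSAPIAuthentication no in /etc/ssh/sshd_config alongside UseDNS no is worth a try.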

  • Setting up a network between a host and guest virtual machine

    - by anonymous
    (I'm running Ubuntu Server 12.04 on VirtualBox.) I'm trying to transfer a file with scp from my laptop to one of the directories of a virtual machine. I tried shared folders, but that failed, and I'm a bit of a networking newbie. I've looked at 20-30 pages; here's one: http://www.howtoforge.com/moving-files-between-linux-systems-with-scp. I followed those steps exactly. My problem is that when I try using scp, it just hangs. I'm also not sure which network interface to configure (eth0 or eth1?) in the guest OS. Another (significant?) detail is that the inet address of eth0 is 10.0.2.15 instead of something like 192.168.x.y. I've enabled the bridged adapter and the host-only adapter, and both the laptop and the guest VM have openssh-server installed. I'm not sure what to do at this point. Is there a better place to ask about this?
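
    For reference, a sketch of the usual shape of the fix: 10.0.2.15 is VirtualBox's NAT address, so scp should target the bridged or host-only interface (typically eth1), which first has to be configured inside the guest. The interface name and addresses below are assumptions:

        # Inside the guest: give the second adapter a config and bring it up
        sudo tee -a /etc/network/interfaces >/dev/null <<'EOF'

        auto eth1
        iface eth1 inet dhcp
        EOF
        sudo ifup eth1
        ip addr show eth1                            # note the 192.168.x.y address it receives
        # From the laptop: copy to that address, not to 10.0.2.15
        scp myfile.txt user@192.168.56.101:/home/user/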
