Search Results

Search found 30804 results on 1233 pages for 'hardware test'.

  • FreeNAS 8 email setup

    - by atrueresistance
    I'm struggling with setting up email reporting in FreeNAS. My build is FreeNAS-8.0.4-RELEASE-x64 (10351). I have my IPv4 default gateway set to 192.168.2.1 (my router) and Nameserver 1 set to 8.8.8.8 (Google's public DNS). Under the Email tab I have:

        From email: ***@gmail.com
        Outgoing mail server: smtp.google.com
        Port to connect to: 465
        TLS/SSL: SSL
        Use SMTP auth: checked
        Username: ***@gmail.com
        Password: ****

    I then went into Accounts and changed the root email to ***@gmail.com. When I try to send a test email, I get: "Your test email could not be sent: timed out". So what am I doing wrong?
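
    A quick way to rule out FreeNAS itself is to try the same SMTP settings from another machine on the network. A minimal sketch using Python's smtplib, with the server and port exactly as configured above and placeholder credentials:

        # Minimal SMTP-over-SSL connectivity test; the host/port mirror the
        # asker's settings above and the credentials are placeholders.
        import smtplib

        try:
            server = smtplib.SMTP_SSL("smtp.google.com", 465, timeout=10)
            server.login("user@gmail.com", "password")  # placeholders
            print("SMTP login OK")
            server.quit()
        except Exception as exc:
            print("SMTP test failed:", exc)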

  • Multi-dimensional array problem in RGSS (RPG Maker XP)

    - by AzDesign
    This is my first day writing script code in RMXP. I read tutorials, Ruby references, etc., and found myself stuck on a weird problem. Here is the scenario: I made a custom script to display layered images. I created the class, created an instance variable to hold the array, and created a simple method to add an element into it. The draw method (the rest of the code is skipped):

        def draw
          image = []
          index = 0
          for i in 0...@components.size
            if image.size > 0
              index = image.size
            end
            image[index] = Sprite.new
            image[index].bitmap = RPG::Cache.picture(@components[i][0] + '.png')
            image[index].x = @x + @components[i][1]
            image[index].y = @y + @components[i][2]
            image[index].z = @z + @components[i][3]
            @test =+ 1
          end
        end

    I created an event that runs this script:

        $layerz = Layerz.new
        $layerz.configuration[0] = ['root',0,0,1]
        $layerz.configuration[1] = ['bark',0,10,2]
        $layerz.configuration[2] = ['branch',0,30,3]
        $layerz.configuration[3] = ['leaves',0,60,4]
        $layerz.draw

    Run, trigger the event, and the result: ERROR! Undefined method `[]' for nil:NilClass, pointing at this line in the draw method:

        image[index].bitmap = RPG::Cache.picture(@components[i][0] + '.png')

    THEN, I changed the method like this, just for testing:

        def draw
          image = []
          index = 0
          for i in 0...@components.size
            if image.size > 0
              index = image.size
            end
            image[index] = Sprite.new
            image[index].bitmap = RPG::Cache.picture(@components[0][0] + '.png')
            image[index].x = @x + @components[0][1]
            image[index].y = @y + @components[0][2]
            image[index].z = @z + @components[0][3]
            @test =+ 1
          end
        end

    I changed @components[i][0] to @components[0][0] and IT WORKS, but only for the root, since it never iterates to the next array index. I'm stuck here. See: in a single-level array, @components[0] and @components[i] cause no problem; in a multi-dimensional array, @components[0][0] causes no problem; BUT in a multi-dimensional array, @components[i][0] produces the error mentioned above. Any suggestion to fix the error? Or did I write something wrong?

  • MS Project - Schedule short duration tasks that stay within working hrs

    - by Dave Warwick
    I am planning a series of tests that take a couple of hours each. However, a test cannot be split, so I do not want the next test to begin if there is not enough time left within the specified working hours of the day to complete it. Also, I would like to begin each day with a set-up period before the actual testing can begin. Is there a way to automatically begin each day with a set-up period, and to have tasks that cannot complete before the end of the specified work day defer starting until the next day?

  • VMware Virtualisation - Convert 64bit Windows Server to 32bit VM?

    - by dannymcc
    I have just started playing around with VMware vSphere and have the hypervisor OS installed on a spare HP ProLiant DL360 G4. I have created a test virtual machine running Ubuntu, which has worked well. As a test project I wanted to convert a powered-on server running Windows Server 2008 64-bit into a virtual machine. As soon as I ran the VMware Go software to start the conversion, it became apparent that I cannot run 64-bit guest OSes on that particular server. So, is there a way of migrating 64-bit to 32-bit during the conversion?

  • Is there a remote file transfer command that preserves nanosecond timestamps?

    - by Denver Gingerich
    I've tried transferring files using scp and rsync on Ubuntu 10.04, but neither of them preserves more than second precision. Here's an example:

        $ touch test1
        $ scp -p test1 localhost:test2
        $ ls -l --full-time test*
        -rw-r--r-- 1 user user 0 2011-01-14 18:46:06.579717282 -0500 test1
        -rw-r--r-- 1 user user 0 2011-01-14 18:46:06.000000000 -0500 test2
        $ cp -p test1 test2
        $ ls -l --full-time test*
        -rw-r--r-- 1 user user 0 2011-01-14 18:46:06.579717282 -0500 test1
        -rw-r--r-- 1 user user 0 2011-01-14 18:46:06.579717282 -0500 test2
        $

    A straight copy works fine, but scp truncates the timestamp. Are there any tools (preferably similar to scp or rsync in their usage) that do remote file transfers while preserving nanosecond timestamps? I could write a hacky script to do it, but I'd rather not.
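
    The "hacky script" route is at least straightforward: nanosecond timestamps can be read with os.stat and re-applied with os.utime. A minimal local sketch of the idea, assuming Python 3.3+ for the st_mtime_ns/ns= APIs (a remote version would re-apply the saved times on the destination host after the transfer):

        # Copy a file, then restore the source's full-precision timestamps.
        import os
        import shutil

        def copy_with_ns_timestamps(src, dst):
            shutil.copyfile(src, dst)  # transfers the data, not the metadata
            st = os.stat(src)          # nanosecond-precision stat
            os.utime(dst, ns=(st.st_atime_ns, st.st_mtime_ns))

        copy_with_ns_timestamps("test1", "test2")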

  • After new install of 12.04, black screen with blinking cursor

    - by gregor
    I installed 12.04 and after booting I got a black screen. Next I started in failsafe mode, but failsafeX (all options) did not work (black screen). Then I started a root shell, remounted the HD read-write, and at the prompt built an xorg.conf with:

        X -configure

    I have edited xorg.conf to delete obsolete monitors and screens. After reboot I get a black screen with a blinking cursor (no terminals). What can I do? How should I edit xorg.conf, if that could be the problem? Hardware: Radeon HD 5470 and i3 with i915.

  • U Central Florida Streamlines Administrative Apps

    - by jay.richey
    The University of Central Florida is wrapping up a multi-year implementation of new enterprise applications that includes a combination of Oracle software and Sun hardware to streamline its administrative processes and help manage student growth. The Orlando-based university currently has more than 56,000 students, making it the second largest American university by enrollment, behind only Arizona State University, which has more than 70,400 students. Read more...

  • 4096 and 8192 block size read slower than write when using LSI 9361-8i RAID 10?

    - by Min Hong Tan
    Is it possible that 1024 and 2048 block size read speeds are faster than at 4096 and 8192? I'm using an LSI 9361-8i with RAID 10 and 8 x Kingston E50 250 GB drives. Results:

        1024 = Write: 2,251 MB/s  Read: 2,625 MB/s
        2048 = Write: 2,141 MB/s  Read: 3,672 MB/s
        4096 = Write: 2,147 MB/s  Read:   231 MB/s
        8192 = Write: 2,147 MB/s  Read:   442 MB/s

    How is this possible? Below are the readings from when I simply wanted to test the RAID 10 function with a disaster test, by taking out one of the 250 GB hard disks. The results are different, as below:

        1024 = Write: 825 MB/s  Read: 1,139 MB/s
        2048 = Write: 797 MB/s  Read: 1,312 MB/s
        4096 = Write: 911 MB/s  Read: 1,342 MB/s
        8192 = Write: 786 MB/s  Read: 1,204 MB/s

    Here the 4096 and 8192 block results are different again. Can anyone explain whether this is normal, or do I need to do some tuning/configuration? Will it affect my host Linux performance?
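
    One way to check whether the controller or the benchmark tool is responsible is to time sequential reads at each block size from the host side. A rough sketch (the device path is a placeholder; without O_DIRECT the Linux page cache will inflate the numbers, so drop caches between runs):

        import os
        import time

        PATH = "/dev/sdX"      # placeholder: the RAID volume or a large test file
        TOTAL = 1 << 30        # read 1 GiB per block size

        for bs in (1024, 2048, 4096, 8192):
            fd = os.open(PATH, os.O_RDONLY)
            start = time.time()
            done = 0
            while done < TOTAL:
                chunk = os.read(fd, bs)
                if not chunk:
                    break
                done += len(chunk)
            os.close(fd)
            print("bs=%d: %.0f MB/s" % (bs, done / (time.time() - start) / 1e6))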

  • Can't connect to Windows Server 2008 shared folders via VPN connection

    - by Pearl
    I set up a VPN connection on my 2008 server using RRAS. The VPN seems to work fine: I can connect from outside the network, and I am also able to establish a remote access connection via the VPN IP. However, I can't access my shared folders. After connecting to the VPN I can ping the server, but it is not shown in my networks. Using \\ip or \\server-name doesn't work either; it cannot be found. I checked ipconfig and this is what I found regarding the VPN:

        DNS suffix:
        Description: test
        Physical address:
        DHCP activated: No
        Auto-config: Yes
        IPv4 address: 192.168.2.114
        Subnet mask: 255.255.255.255
        Default gateway:
        DNS server: 192.168.0.1
        NetBIOS: activated

    To clarify my IP situation: the server is connected to a router on 192.168.0.x, the test client is in an external network behind a router on 192.168.1.x, and the server-client connection uses static IPs in 192.168.2.x. Can anyone help me with this one? The VPN itself should be OK, since I am able to establish remote access.
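
    Since ping works but shares don't, a quick check is whether the SMB ports are reachable across the tunnel; a timeout here points at routing or firewall rules rather than name resolution. A minimal sketch (the server address is a placeholder for the server's LAN IP):

        import socket

        SERVER = "192.168.0.10"   # placeholder: substitute the server's LAN address

        for port in (445, 139):   # SMB over TCP, NetBIOS session service
            try:
                socket.create_connection((SERVER, port), timeout=5).close()
                print("port %d: open" % port)
            except OSError as exc:
                print("port %d: %s" % (port, exc))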

  • Instant database snapshot

    - by raj
    My product uses an Oracle 9 database in its backend. Every week a new release of the product is launched, which will fire some DML and DDL queries at the database. I usually test the product release against a dummy database before applying it to the main database. I create a database dump using the exp command, then import it into the dummy database using imp. Then I test the product on the dummy database and check whether there are any errors. This exp and imp cycle takes about 3 hours to complete. Is there any alternative, such as an instant snapshot of the live database (which would be independent of the live one)? Or is there an option to keep the dummy database in sync with the original database at all times? That could be done by making the product fire its DML and DDL queries at both databases, but this would be a HUGE performance problem. How can I overcome this?

  • Mobile Linux Gets Support From Chip Vendors

    Hardware Central: "The development of Linux on mobile devices may be poised to get a boost thanks to the formation of a new industry group called Linaro, backed by a consortium of chip vendors including ARM, Freescale, Texas Instruments, Samsung and ST-Ericsson."

  • SQL SERVER – Core Concepts – Elasticity, Scalability and ACID Properties – Exploring NuoDB an Elastically Scalable Database System

    - by pinaldave
    I have been recently exploring the Elasticity and Scalability attributes of databases. You can see that in my earlier blog posts about NuoDB, where I wanted to look at Elasticity and Scalability concepts. The concepts are very interesting, and intriguing as well. I have discussed these concepts with my friend Joyti M, and together we have come up with this interesting read. The goal of this article is to answer the following simple questions: What is Elasticity? What is Scalability? How do ACID properties vary from NoSQL concepts? What are the prevailing problems in current database system architectures? Why is NuoDB an innovative and welcome change in the database paradigm?

    Elasticity

    This word's original form is used in many different ways, and honestly it does do a decent job of holding things together over the years as a person grows and contracts. Within the tech world, and specifically related to software systems (databases, application servers), it has come to mean a few things: allow stretching of resources without reaching the breaking point (on demand). What are resources in this context? Resources are the usual suspects: RAM/CPU/IO/bandwidth, in the form of a container (a process, or a bunch of processes combined as modules). When it is about increasing resources, the simplest idea which comes to mind is the addition of another container, which means adding a brand new physical node. When adding a new node, two questions come to mind: 1) Can we add another node to our software system? 2) If yes, does adding the new node cause downtime for the system? Let us assume we have added a new node, and see what the system now needs: balancing incoming requests across multiple nodes, synchronization of shared state across multiple nodes, and identification of a "down" state along with resolution actions to bring it back "up". Adding a new node has its advantages as well. Here are a few of the positive points: throughput can increase nearly horizontally across the nodes of the system, and response times of the application will improve as in-between layer interactions improve.

    Now let us put the above concepts in the perspective of a database. When we mention "running out of resources" or "application is bound to resources", the resources can be CPU, memory or bandwidth. The regular approach to gaining scalability in the database is to look for bottlenecks and increase the bottlenecked resource. When memory is the bottleneck, we look at the data buffers, locks, query plans or indexes. After a point even this is not enough, as there needs to be an efficient way of managing such a large workload on a "single machine" across memory- and CPU-bound workloads (the right kind of scheduling). We next move on to either read/write separation of the workload or functionality-based sharding, so that we still have control of the individual pieces. But this requires lots of planning, changes in client systems in terms of knowing where to go/update/read, and intelligent aggregation of the data for reporting applications. What we ideally need is an intelligent layer which allows us to do these things without getting into managing, monitoring and distributing the workload ourselves.

    Scalability

    In the context of databases/applications, scalability means three main things:

    - The ability to handle normal loads without pressure, e.g. X users at Y utilization of resources (CPU, memory, bandwidth) on Z kind of hardware (a 4-processor, 32 GB machine with 15,000 RPM SATA drives and a 1 GHz network switch) with T throughput
    - The ability to scale up to an expected peak load, greater than the normal load, with acceptable response times
    - The ability to provide acceptable response times across the system, e.g. a response time of S milliseconds (or an agreed-upon unit of measure) 90% of the time

    The Issue - The Need for Scale

    In normal cases one can plan load testing for normal, peak, and stress scenarios to ensure that specific hardware meets the needs. With help from hardware and software partners and best practices, bottlenecks can be identified and the requisite resources added to the system. Unfortunately this vertical scaling is expensive and difficult to achieve, and most operational people need the ability to scale horizontally instead; this helps in getting better throughput, as there are physical limits to adding resources (memory, CPU, bandwidth and storage) indefinitely. Today we have different options to achieve scalability:

    Read & Write Separation. The idea here is to direct actual writes to one store and configure slaves that receive the latest data with acceptable delays. The slaves can be used to balance out reads. We can also explore functional separation or sharding: separating data operations by a specific identifier (e.g. region, year, month) and consolidating them for reporting purposes. For functional separation the major disadvantage appears when the schema or the workload pattern changes. As requirements grow, one still needs to deal with the need to scale in manual ways, by providing an abstraction in the middle-tier code.

    Using NoSQL solutions. The idea is to flatten out the structures in general, keep all values which are retrieved together at the same store, and provide a flexible schema. The issue with these stores is that they mostly compromise on consistency (no ACID guarantees), and one has to use a non-SQL dialect to work with the store. The other major issue is education: would one really want to make these compromises on the ability to connect and retrieve in a simple SQL manner, and learn other skill sets? Or, for that matter, give up the ACID guarantee and start dealing with consistency issues?

    Hybrid Deployment - Mac, Linux, Cloud, and Windows. One of the challenges we see today across on-premise vs. cloud infrastructure is a difference in abilities. Take for example SQL Azure: it is wonderful in its concept of throttling resources (as it is a shared deployment) and its ability to scale using federation. However, the same abilities are not available on premise. This is not a mistake, mind you, but a compromise around the sweet spot of workloads, customer requirements and operational SLAs which can be supported by the team. In today's world it is imperative that databases are available across operating systems, which are a commodity and used by developers of all hues.

    An Ideal Database Ability List

    - A system which allows linear scaling of the system (increase in throughput with reasonable response time) with the addition of resources
    - A system which does not compromise on ACID guarantees or require developers to learn new paradigms
    - A system which does not force-fit a new way of interacting with the database by requiring a non-SQL dialect
    - A system which does not force-fit its mechanisms for providing availability across its various modules

    Well, NuoDB is the first database which has all of the above abilities and much more. In future articles I will cover my hands-on experience with it. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: NuoDB
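
    The read/write separation option described above is easiest to picture as a thin routing layer in front of the stores. A minimal sketch, not NuoDB-specific, with placeholder DB-API connections (a real implementation must also handle replication lag, failover and retries):

        import itertools

        class RoutedDB:
            """Route writes to the primary; round-robin reads across replicas."""

            def __init__(self, primary, replicas):
                self.primary = primary
                self.replicas = itertools.cycle(replicas)

            def execute(self, sql, params=()):
                is_read = sql.lstrip().lower().startswith("select")
                conn = next(self.replicas) if is_read else self.primary
                cursor = conn.cursor()
                cursor.execute(sql, params)
                return cursor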

  • PASS 13 Dispatches: moving to the cloud

    - by Tony Davis
    PASS Summit 13, Day 1 keynote by Quentin Clarke, and we're hearing about "redefining mission critical in the cloud". With a move to the Windows Azure cloud comes the promise of capacity on demand, automatic HA, backups, patching and so on, as well as passing responsibility to MS for managing hardware, upgrades and so on. However, for many databases and applications the best route to the cloud is not necessarily obvious. For most, the path of least resistance is IaaS: SQL Server in an Azure VM. It removes the hardware burden, but you still have to manage your databases, and implementing HA for SQL Server is your responsibility. Also, scaling up comes at quite a cost: the biggest VM (8 CPU cores, 56 GB RAM, 16 1 TB drives with 500 IOPS each) weighs in at over $4,500 per month. With PaaS, in the form of Windows SQL Database, you get a "3-copies replica set", so HA comes out of the box and the majority of the administration burden is removed, but you are moving your database into a very different environment. For a start, it's a shared environment, with other customers using the same compute nodes in the cluster, and potentially even sharing the same database (multi-tenancy). Unless you pay for SQL DB Premium edition, the resources available for your workload will depend on how nicely others "play" in the shared environment. You'll potentially need to do a lot of tuning and application rewriting to avoid throttling issues, optimising application-database communication to deal with increased latency between the two, and so on. You'll need aggressive application caching. You'll also need retry logic, to deal with (expected) node failure and the need to reconnect. In Tuesday's PASS Summit pre-con from the SQLCAT team, they spent a lot of time covering telemetric techniques (collecting the necessary monitoring data into Azure storage) to perform capacity planning and work out the hotspots and bottlenecks in your cloud applications. Tools like WAD (Windows Azure Diagnostics), performance counters, SQL Database DMVs, and others will be essential. Of course, to truly exploit the vast horizontal scaling that is available from the existence of thousands of compute nodes, you'll also need to consider how to "shard" your data so Azure can move it between nodes at will. Finding the right path to the Cloud isn't easy, but it's coming. I spoke to people one year ago who saw no real benefit in trying to move their infrastructure and databases to the cloud, but now at their company it's the conversation that won't go away. Tony.
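
    The retry logic mentioned above usually takes the shape of exponential backoff with jitter around any operation that may hit a transient fault. A minimal sketch of the pattern (the TransientError type stands in for whatever transient-fault exceptions a given client library raises):

        import random
        import time

        class TransientError(Exception):
            """Placeholder for a client library's transient-fault errors."""

        def with_retries(operation, attempts=5, base_delay=0.5):
            for attempt in range(attempts):
                try:
                    return operation()
                except TransientError:
                    if attempt == attempts - 1:
                        raise  # give up after the last attempt
                    # exponential backoff with jitter to spread out reconnects
                    delay = base_delay * (2 ** attempt) * random.uniform(0.5, 1.5)
                    time.sleep(delay)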

  • IBM 8305-29u sound comes from internal speaker instead of green port - Ubuntu 10.04 LTS

    - by TecBrat
    The title pretty much says it all. My experience with Ubuntu is VERY limited. I did find the system test feature and ran an audio test. It found my device as an Intel 82801DB-ICH4. I wasn't getting anything at first; I tried a couple of things and don't really remember what, but then I started getting audio (not just beeps) from the system's internal speaker. Any ideas? (I can't rule out a hardware problem. This box sat under my bed for a couple of years, and its first life was in an environment FULL of cement dust.)

  • Has Little Endian won?

    - by espertus
    When teaching recently about the Big vs. Little Endian battle, a student asked whether it had been settled, and I realized I didn't know. Looking at the Wikipedia article, it seems that the most popular current OS/architecture pairs use Little Endian but that Internet Protocol specifies Big Endian for transferring numeric values in packet headers. Would that be a good summary of the current status? Do current network cards or CPUs provide hardware support for switching byte order?
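
    The packet-header convention is easy to demonstrate: the same 32-bit value serializes differently under each byte order, and an htonl-style conversion is what network code applies on little-endian hosts. A short illustration:

        import socket
        import struct
        import sys

        n = 0x01020304
        print(sys.byteorder)               # host byte order, e.g. 'little'
        print(struct.pack("<I", n).hex())  # little-endian: 04030201
        print(struct.pack(">I", n).hex())  # big-endian (network order): 01020304
        print(hex(socket.htonl(n)))        # byte-swapped on little-endian hosts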

  • Apache w/out internet connection

    - by robert knobulous
    I have a Vista laptop that I have been running Apache / MySQL / PHP / phpMyAdmin on for quite some time without fail. I just use it to test bits of code here and there. No problems, until recently, when I needed to test something and happened to be in a place where I could not get an internet connection. Why am I unable to access localhost from the same machine without an internet connection? I type http://localhost etc. into the browser's address bar and I get a message that I am unable to access it without an internet connection. I checked my windows/system32/etc/hosts file and the first two lines are:

        127.0.0.1 localhost
        ::1 localhost

    What am I missing here?

  • Troubleshooting VC++ DLL in VB.Net

    - by Jolyon
    I'm trying to make a solution in Visual Studio that consists of a VC++ DLL and a VB.Net application. To figure this out, I created a VC++ class library project with the following code (I removed all the junk the wizard creates):

    mathfuncs.cpp:

        #include "MathFuncs.h"

        namespace MathFuncs
        {
            double MyMathFuncs::Add(double a, double b)
            {
                return a + b;
            }
        }

    mathfuncs.h:

        using namespace System;

        namespace MathFuncs
        {
            public ref class MyMathFuncs
            {
            public:
                static double Add(double a, double b);
            };
        }

    This compiles quite happily. I can then add a VC++ console project to the solution, add a reference to the original project, and call it as follows:

    test.cpp:

        using namespace System;

        int main(array<System::String ^> ^args)
        {
            double a = 7.4;
            int b = 99;
            Console::WriteLine("a + b = {0}", MathFuncs::MyMathFuncs::Add(a, b));
            return 0;
        }

    This works just fine and builds to test.exe and mathsfuncs.dll. However, I want to use a VB.Net project to call the DLL. To do this, I add a VB.Net project to the solution, make it the startup project, and add a reference to the original project. Then I attempt to use it as follows:

        MsgBox(MathFuncs.MyMathFuncs.Add(1, 2))

    However, when I run this code, it gives me an error: "Could not load file or assembly 'MathFuncsAssembly, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null' or one of its dependencies. An attempt was made to load a program with an incorrect format." Do I need to expose the method somehow? I'm using Visual Studio 2008 Professional.

  • maildir in Windows for IMAP

    - by User1
    I'm interested in accessing my IMAP accounts offline. I found that maildirs are a simple way to make this work, and that offlineimap takes care of almost everything in making the IMAP-maildir sync happen. Then I can open the account in Mutt or Wanderlust. One major problem: maildirs use colons in their filenames, and Windows doesn't allow colons. I tried

        mount -f -s -b -o managed "d:/tmp/mail" "/home/of/mail"

    in Cygwin, but doing an

        echo test > /home/of/mail/test:file

    didn't work. I'm thinking about ext2fs, but I need an ext2 partition somewhere. Can I make a file into a partition somehow? I don't want to start modifying my hard drive's partition table. Besides, does anyone know if ext2fs supports colons in filenames?
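
    Short of a different filesystem, the usual idea is a translation layer that maps the maildir info separator to a Windows-safe character on write and back on read. A sketch of the idea only (the '!' substitute is an arbitrary choice here, and every tool touching the maildir would have to agree on the mapping):

        SAFE = "!"  # assumed stand-in for ':' on Windows

        def to_windows(name):
            return name.replace(":2,", SAFE + "2,")

        def from_windows(name):
            return name.replace(SAFE + "2,", ":2,")

        print(to_windows("1294876543.M1P2.host:2,FS"))
        # -> 1294876543.M1P2.host!2,FS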

  • SQL Server Capacity Planner

    - by Colt
    Apart from the capacity planner tool for System Center and SharePoint Server, I was looking for a tool which can help me estimate the capacity of SQL Server. I found an article on Microsoft.com about SQL Server 2000 sizing, but unfortunately the links are obsolete and dead: Dell PowerMatch Server Sizing Software and Compaq Active Answer Resources. Finally I found an article that is "close" to my interest: Hardware and Software Requirements for Installing SQL Server 2008. If any of you have heard of any tools for capacity planning or sizing for SQL Server, please drop me a message. Thanks, Colt

  • Check_webinject plugin will not connect to https site

    - by uSlackr
    We're using Nagios to monitor some of our web sites. We have a script using the older plugin that we are trying to switch to webinject.pl from CPAN. When the script runs, it generates this error:

        LWP::Protocol::https::Socket: SSL connect attempt failed with unknown error
        error:1407741A:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert decode error
        at /usr/local/share/perl5/LWP/Protocol/http.pm line 51.

    It appears the web site does not support TLSv1 for https. If it matters, the site is a Cisco WebVPN. I've pointed the same script at a different site that does support TLSv1 and it seems to work fine. My web search is coming up empty. Successful connect:

        <case id="1" description1="Metro Home Page" description2="Metro, login test"
            method="get" url="https://metro.myco.com/index.php"
            verifypositive="restricted" logrequest="yes" logresponse="yes"
            sleep="1" />

    Failing connect:

        <case id="2" description1="WebVPN Home Page" description2="webvpn.myco.com login test"
            method="get" url="https://webvpn.myco.com/webvpn.html"
            verifypositive="Authorized" logrequest="yes" logresponse="yes"
            sleep="1" />
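
    The "no TLSv1" suspicion can be confirmed outside Nagios by probing which protocol versions the server will negotiate. A minimal sketch (hostname as above; which PROTOCOL_* constants exist depends on how your Python/OpenSSL was built, hence the getattr):

        import socket
        import ssl

        HOST = "webvpn.myco.com"

        for name in ("PROTOCOL_SSLv3", "PROTOCOL_TLSv1"):
            proto = getattr(ssl, name, None)
            if proto is None:
                print(name, "not available in this Python build")
                continue
            try:
                ctx = ssl.SSLContext(proto)
                with socket.create_connection((HOST, 443), timeout=10) as raw:
                    with ctx.wrap_socket(raw) as tls:
                        print(name, "negotiated:", tls.version())
            except (OSError, ssl.SSLError) as exc:
                print(name, "failed:", exc)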

  • How to fix v4l2 Input/output error on a Vostro 1510, Ubuntu 13.04 64-bit?

    - by Fabio C. Barrionuevo da Luz
    I did a clean install of Ubuntu 13.04 64-bit, but Cheese and SimpleCV no longer function properly. In previous versions of Ubuntu, everything worked normally. When running either program, I get the following message:

        libv4l2: error turning on stream: Input/output error

    PS: sorry, my English is very ugly. A full hardware description of my notebook is at this link: https://gist.github.com/luzfcb/5873728

  • Solution For Printer Problems

    The Most Common Fixes For Printer Problems. A printer is a hardware device that produces output in text or graphic form. In general, the printer is not part of your computer system; you have ... [Author: Kaisar Adnan - Computers and Internet - April 12, 2010]
