Search Results

Search found 7533 results on 302 pages for 'knowledge transfer'.

Page 39/302 | < Previous Page | 35 36 37 38 39 40 41 42 43 44 45 46  | Next Page >

  • Slow Windows Explorer on Windows 7

    - by MadBoy
    I have a laptop with an i7 (4 cores), 8 GB RAM and an OCZ Vertex 3 MaxIOPS SSD, which in a test I just ran does 400 MB/s+ read/write. However, the responsiveness of Windows Explorer is far from perfect. Opening Computer or Documents and going into folders is very slow (1-5 seconds). I don't have any viruses or spyware, and I have tried changing folder properties to optimize the view for General Items. I tried disabling the Search Indexer, but it made search in Outlook 2010 crawl and had no other effect. Even double-clicking a file (like a Word document) takes some time to open. I don't have any drives mapped, and my computer is not joined to a domain. I have multiple VPN connections that I connect to, but they all have default gateways disabled. I tried using CCleaner and a Windows 7 tweaks app to disable some things. I am a power user using Visual Studio, TortoiseSVN and other developer/administration apps. Any non-obvious ideas?

    Edit: I've been trying to pinpoint where the issue comes from, and it seems that straight after a reboot Windows Explorer opens very fast. When I load 3-4 programs (Royal TS, Visual Studio, Outlook) it's noticeably slower, and the more programs I have open, the worse it gets. After I start closing programs it works better, and if I leave 2 open it's fast again. I tried doing some research with DiskMon and other Sysinternals programs but couldn't find anything suspicious. Below are stats during normal usage with a lot of programs open:

    - RAM usage with a lot of programs open and no swap file (I disabled it for testing): 6.95 GB
    - CPU usage: 15%, no core takes more than 50% (I have 4 instances of VS 2010 open)

    HD Tune Pro: OCZ-VERTEX3 MI Benchmark
    Test capacity: full
    Read transfer rate:
    Transfer Rate Minimum: 363.9 MB/s
    Transfer Rate Maximum: 505.5 MB/s
    Transfer Rate Average:
    Access Time:
    Burst Rate:
    CPU Usage:

    HD Tune Pro: OCZ-VERTEX3 MI File Benchmark
    Drive C: transfer rate test, File Size: 500 MB
    Sequential read: 484102 KB/s
    Sequential write: 444714 KB/s
    Random read: 7779 IOPS
    Random write: 16888 IOPS
    Random read (queue depth = 32): 73007 IOPS
    Random write (queue depth = 32): 69790 IOPS

    HD Tune Pro: OCZ-VERTEX3 MI Random Access (test capacity: full, read test)
    Transfer size / operations per sec / avg. access time / max. access time / avg. speed:
    512 bytes: 3260 IOPS, 0.306 ms, 2.106 ms, 1.592 MB/s
    4 KB: 4161 IOPS, 0.240 ms, 2.006 ms, 16.256 MB/s
    64 KB: 2382 IOPS, 0.419 ms, 2.367 ms, 148.934 MB/s
    1 MB: 449 IOPS, 2.225 ms, 4.197 ms, 449.407 MB/s
    Random: 809 IOPS, 1.235 ms, 6.551 ms, 410.527 MB/s

    HD Tune Pro: OCZ-VERTEX3 MI Extra Tests (test capacity: full)
    Random seek: 3975 IOPS, 0.252 ms, 1.941 MB/s
    Random seek 4 KB: 4245 IOPS, 0.236 ms, 16.583 MB/s
    Butterfly seek: 4086 IOPS, 0.245 ms, 1.995 MB/s
    Random seek / size 64 KB: 3812 IOPS, 0.262 ms, 58.606 MB/s
    Random seek / size 8 MB: 120 IOPS, 8.348 ms, 485.737 MB/s
    Sequential outer: 4524 IOPS, 0.221 ms, 282.721 MB/s
    Sequential middle: 4429 IOPS, 0.226 ms, 276.818 MB/s
    Sequential inner: 5504 IOPS, 0.182 ms, 344.000 MB/s
    Burst rate: 4472 IOPS, 0.224 ms, 279.475 MB/s

    Read the article

  • Migrating Outlook data with oracle connector for outlook

    - by amir shadaab
    I have a system that uses the Oracle Connector for MS Outlook 2007. I recently bought a new system and I want to transfer all my email (the mail that uses the Oracle connector) to it with all the same settings. I know that for a normal transfer, I just need to copy the .pst file and open it on the other system, but I'm not sure how to go about this when the mail lives on an Oracle Connector server. Please help me out with this one.

    Read the article

  • How to get higher download rate than upload rate

    - by user23950
    I'm using bandwidthplace.com to measure my internet speed, and here are the results: Download Speed: 422 kbps (52.8 KB/sec transfer rate); Upload Speed: 202 kbps (25.3 KB/sec transfer rate). Can you please explain to me how they got the 52.8 KB/sec transfer rate? And how do I lower the upload speed in order to get a higher download speed?
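
    For the arithmetic part of the question: kbps to KB/sec is just a divide by 8, since a byte is 8 bits — a quick sketch of the conversion (assuming the site rounds to one decimal place):

        # kilobits per second -> kilobytes per second (8 bits per byte)
        def kbps_to_kBps(kbps):
            return kbps / 8.0

        print(kbps_to_kBps(422))  # 52.75, reported as 52.8 KB/sec
        print(kbps_to_kBps(202))  # 25.25, reported as 25.3 KB/sec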

    Read the article

  • Hidden DNS master only sending notify to one slave

    - by Rob
    My hidden DNS master is only sending notifies to one of the name servers for a zone. I have 3 named servers — ns0, ns1 and ns2 — all running BIND 9.7.3.dfsg-1ubuntu4.1. When an update is processed, the master (ns0) seems to behave normally.

    ns0 (192.168.2.50):
    zone domain.org/IN: sending notifies (serial 2012060703)
    client 192.168.2.52#42892: transfer of 'domain.org/IN': AXFR-style IXFR started: TSIG rndc-key
    client 192.168.2.52#42892: transfer of 'domain.org/IN': AXFR-style IXFR ended

    ns2 (192.168.2.52):
    client 192.168.2.50#3762: received notify for zone 'domain.org': TSIG 'rndc-key'
    zone domain.org/IN: Transfer started.
    transfer of 'domain.org/IN' from 192.168.2.50#53: connected using 192.168.2.52#55747
    zone domain.org/IN: transferred serial 2012060704: TSIG 'rndc-key'
    transfer of 'domain.org/IN' from 192.168.2.50#53: Transfer completed: 1 messages, 34 records, 1028 bytes, 0.001 secs (1028000 bytes/sec)

    Nothing happens on ns1. I've turned up the logging level, but there's no information in syslog about which name servers BIND has sent notifications to, so I guess this is something it doesn't log. I've also tried watching tcpdump; it never makes any attempt to notify ns1, only ns2:

    192.168.2.50.56278 > 192.168.2.52.53: [udp sum ok] 56418 notify [b2&3=0x2400] [1a] [1au] ? SOA? domain.org. domain.org. [0s] SOA ns1.domain.net. dnsmaster.domain.net. ? 2012060801 10800 3600 604800 3600 ar: rndc-key. ANY [0s] TSIG hmac-md5.sig-alg.reg.int. fudge=300 maclen=16 origid=56418 error=0 otherlen=0 (174)

    The authoritative zone has both ns1 and ns2 records:

    $ORIGIN domain.org.
    $TTL 3h
    @ IN SOA ns1.domain.net. dnsmaster.domain.net. (
        2012060801 ; Serial yyyymmddnn
        3h         ; Refresh after 3 hours
        1h         ; Retry after 1 hour
        1w         ; Expire after 1 week
        1h )       ; Minimum negative caching of 1 hour
    @ 3600 IN NS ns1.domain.net.
    @ 3600 IN NS ns2.domain.net.

    Edit: I have added also-notify {192.168.2.51;192.168.2.52;}; explicitly to the zone configuration and it all works fine — both ns1 and ns2 get notify messages and transfers succeed. I was under the impression BIND would automatically send notifies to all NS records on a zone; maybe it's bugged?
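
    For reference, a minimal sketch of how the workaround from the edit might look in the master's named.conf zone statement (the file path here is hypothetical):

        zone "domain.org" {
            type master;
            file "/etc/bind/db.domain.org";  // hypothetical path
            // explicit notify targets, since NS-record-based notifies
            // were not reaching ns1 in this setup
            also-notify { 192.168.2.51; 192.168.2.52; };
        };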

    Read the article

  • How to save photos from an iPhone to a computer

    - by goodm
    I am using an iPhone. When we have fun we take a lot of nice pics, but then there is a question: how do you transfer the photos from the iPhone to a computer? Let me show you, step by step:

    Step 1: Download the Tansee iPhone Transfer Photo free trial version and install it. You also need iTunes 7.3 or above installed. Download at: http://www.softseeking.com/prodail.aspx?proid=74
    Step 2: Connect the iPhone to your computer.
    Step 3: Launch Tansee iPhone Transfer Photo and all the photos on your iPhone will display automatically.
    Step 4: Select the photos to be transferred to your computer; the selected files will be marked with a red border. You can select photos by clicking on each one, or just drag a rectangle to select a bundle of photos. You can also select all photos by clicking the right button of your mouse or clicking "File" to choose. Note: you can only select the first 6 photos if you haven't purchased.
    Step 5: Click the "Copy" button to select an output path and start to transfer the photos to your computer. For iPhone camera photos and camera videos: click "Camera Roll" and follow the steps above to copy them to the PC.

    Options settings:
    1. Backup File Format: select the backup photo file format; Tansee iPhone Transfer Photo currently supports BMP and JPG files.
    2. Backup Path: select the directory for storing the backup photos. You can select the backup directory for each photo during backup by checking "Ask Every Time", or store all files in a specified directory by checking "Save Here" and selecting the directory in the edit box.
    3. Backup Resolution: select the photo size to back up.

    Read the article

  • WD Caviar Green Extremely Slow

    - by Steven
    I am encountering a really weird problem with my WD Caviar Green HDD. First of all, I have 2 HDDs in my desktop: one 160 GB Seagate holding my Win7 Ultimate x64 install, and the problematic one, a WD 1.5 TB Caviar Green for storage. My problem is kinda weird: when I transfer files from my Seagate (C:) to my WD (D:) the speed is good (50-60 MB/s). The problem arises when I transfer too "many" large files — the transfer speed goes straight down to kilobytes/s. After I cancel the transfer and access D:, even entering a folder requires loading for about 10 seconds. The problem arises not only when I am transferring files to D:; it seems my WD can't handle much activity at all. For instance, last time I installed a game on D: I would see a lot of lag after playing for some time. When the same game is installed on C:, no problem arises. Does anyone know what the problem is?

    P/S: There is one temporary workaround I have tried. After the "situation" occurs, I access as many folders on D: as I can and let them load; repeating this and giving it some time brings D: back to speedy transfers. However, large transfers cause the situation to happen again. Does it have something to do with caching?

    Read the article

  • Why is piping dd through gzip so much faster than a direct copy?

    - by Foo Bar
    I wanted to back up a path from a computer in my network to another computer in the same network over a 100 Mbit/s line. For this I did

    dd if=/local/path of=/remote/path/in/local/network/backup.img

    which gave me a very low network transfer speed of about 50 to 100 kB/s, which would have taken forever. So I stopped it and decided to gzip it on the fly to make the amount to transfer smaller. So I did

    dd if=/local/folder | gzip > /remote/path/in/local/network/backup.img.gz

    But now I get something like 1 MB/s network transfer speed, so a factor of 10 to 20 faster. After noticing this, I tested it on several paths and files, and it was always the same. Why does piping dd through gzip increase the transfer rate by a large factor, instead of only reducing the byte length of the stream by a large factor? If anything I'd have expected a small decrease in transfer rate, due to the higher CPU consumption while compressing, but instead I get a double win. Not that I'm unhappy, just wondering. ;)
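
    One variable worth isolating here (a sketch of an experiment, not a definitive answer): dd uses 512-byte blocks by default, so the direct copy issues many tiny writes to the remote path, while the pipeline decouples reading from the network writes. Comparing runs like these would show whether block size alone explains the gap:

        # default 512-byte blocks, each written to the network mount
        dd if=/local/path of=/remote/backup.img
        # same copy with 1 MiB blocks, far fewer network round trips
        dd if=/local/path of=/remote/backup.img bs=1M
        # the pipelined variant from the question, for comparison
        dd if=/local/path | gzip > /remote/backup.img.gz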

    Read the article

  • Copied a file with winscp; only winscp can see it

    - by nilbus
    I recently copied a 25.5 GB file from another machine using WinSCP. I copied it to C:\beth.tar.gz, and WinSCP can still see the file. However, no other app (including Explorer) can see the file. What might cause this, and how can I fix it?

    The details that might or might not matter:
    - WinSCP shows the size of the file (C:\beth.tar.gz) correctly as 27,460,124,080 bytes, which matches the file size on the remote host
    - Neither Explorer, cmd (dir C:\), the 7-Zip archive program, nor any other File Open dialog can see the beth.tar.gz file under C:\
    - I have configured Explorer to show hidden files
    - I can move the file to other directories using WinSCP
    - If I try to move the file to Users/, UAC prompts me for administrative rights, which I grant, and then I get this error: "Could not find this item. The item is no longer located in C:\"
    - When I try to transfer the file back to the remote host in a new directory, the transfer starts successfully and transfers data
    - The transfer had about 30 minutes remaining when I left it for the night
    - The morning after the file transfer, I was greeted with a message saying that the connection to the server had been lost. I don't think this is relevant, since I did not tell it to disconnect after the file was done transferring, and it likely disconnected after the transfer finished
    - I'm using an old version of WinSCP: v4.1.8 from 2008
    - I can view the file properties in WinSCP: Type of file: 7zip (.gz); Location: C:\; Attributes: none (Read-only, Hidden, Archive, or Ready for indexing); Security: SYSTEM, my user, and the Administrators group have full permissions — everything other than "special permissions" is checked under Allow for all 3 users/groups

    What's going on?!
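
    Two low-risk checks from an elevated command prompt, offered as a guess at attribute problems rather than a confirmed diagnosis (both are standard Windows commands: dir /a lists files regardless of hidden/system attributes, and attrib shows which attributes are actually set):

        dir /a C:\beth.tar.gz
        attrib C:\beth.tar.gz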

    Read the article

  • Port forwarding in VirtualBox

    - by davidzaz
    I have VirtualBox set up with the following commands:

    vboxmanage setextradata myVm "VBoxInternal/Devices/pcnet/0/LUN#0/Config/transfer/HostPort" 50000
    vboxmanage setextradata myVm "VBoxInternal/Devices/pcnet/0/LUN#0/Config/transfer/GuestPort" 50000
    vboxmanage setextradata myVm "VBoxInternal/Devices/pcnet/0/LUN#0/Config/transfer/Protocol" TCP

    On the host machine, the following command times out:

    telnet localhost 50000

    What am I doing wrong? The above command does work on the guest machine.
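
    For what it's worth, newer VirtualBox releases expose the same NAT port forwarding through a dedicated VBoxManage switch, which avoids hand-building the extradata keys (a sketch; "transfer" is just the rule name, and the VM must be using NAT networking):

        VBoxManage modifyvm "myVm" --natpf1 "transfer,tcp,,50000,,50000"

    Also note that the setextradata paths above only apply to the pcnet adapter; if the VM is configured with a different emulated NIC (e.g. an Intel e1000), the path must name that device instead of pcnet.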

    Read the article

  • Stream computer screen to TV via network instead of a USB wireless link

    - by user24559
    I want to stream my computer screen (not just video or a limited amount of content) to my TV via the network. I know there are wireless devices that use USB to transfer the screen to the TV; however, these are limited to a short distance. What I want is to stream the data over the network so I can be anywhere within the network and have it shown on the TV. I am looking to transfer both video and sound — the entire computer screen, just like when you connect the computer to the TV via VGA or HDMI with sound out over the 3.5 mm plug. I have been unable to find a unit that allows the entire computer screen to transfer via the network; I can only find the ability to stream video files. I am using Windows 7 Ultimate with a quad-core processor and 16 GB of memory, so I have the power to handle the transfer. My TV is an HDTV.

    Read the article

  • Super slow Western Digital external hard drive

    - by shinokada
    I have a 2 TB WD external hard drive. I use Windows Vista 32-bit on a Dell Latitude D630 laptop and connect through a USB cable. When I transfer from my C: drive to the external hard drive, it transfers at only 3 KB/sec; it takes 30 minutes to transfer 6 MB. It is useless at the moment. Can anyone help me speed it up, please? Thanks in advance.

    Read the article

  • Fraud Detection with the SQL Server Suite Part 2

    - by Dejan Sarka
    This is the second part of the fraud detection whitepaper. You can find the first part in my previous blog post about this topic.

    My Approach to Data Mining Projects

    It is impossible to evaluate the time and money needed for a complete fraud detection infrastructure in advance. Personally, I do not know the customer's data in advance. I don't know whether there is already an existing infrastructure, like a data warehouse, in place, or whether we would need to build one from scratch. Therefore, I always suggest starting with a proof-of-concept (POC) project. A POC takes something between 5 and 10 working days, and involves personnel from the customer's site – either employees or outsourced consultants. The team should include a subject matter expert (SME) and at least one information technology (IT) expert. The SME must be familiar with both the domain in question and the meaning of the data at hand, while the IT expert should be familiar with the structure of the data, how to access it, and have some programming (preferably Transact-SQL) knowledge. With more than one IT expert, the most time-consuming work, namely data preparation and overview, can be completed sooner. I assume that the relevant data is already extracted and available at the very beginning of the POC project.

    If a customer wants to have their people involved in the project directly and requests the transfer of knowledge, the project begins with training. I strongly advise this approach, as it offers the establishment of a common background for all people involved, an understanding of how the algorithms work, an understanding of how the results should be interpreted, a way of becoming familiar with the SQL Server suite, and more. Once the data has been extracted, the customer's SME (i.e. the analyst) and the IT expert assigned to the project will learn how to prepare the data in an efficient manner. Together with me, knowledge and expertise allow us to focus immediately on the most interesting attributes and identify any additional, calculated ones soon after. By employing our programming knowledge, we can, for example, prepare tens of derived variables, detect outliers, identify the relationships between pairs of input variables, and more, in only two or three days, depending on the quantity and quality of the input data.

    I favor the customer's decision to assign additional personnel to the project. For example, I actually prefer to work with two teams simultaneously. I demonstrate and explain the subject matter by applying techniques directly on the data managed by each team, and then both teams continue to work on the data overview and data preparation under our supervision. I explain to the teams what kind of results we expect, the reasons why they are needed, and how to achieve them. Afterwards we review and explain the results, and continue with new instructions, until we resolve all known problems.

    Simultaneously with the data preparation, the data overview is performed. The logic behind this task is the same – again I show the teams involved the expected results, how to achieve them and what they mean. This is also done in multiple cycles, as is the case with data preparation, because, quite frankly, both tasks are completely interleaved. A specific objective of the data overview is of principal importance – it is represented by a simple star schema and a simple OLAP cube that will first of all simplify data discovery and interpretation of the results, and will also prove useful in the following tasks.

    The presence of the customer's SME is the key to resolving possible issues with the actual meaning of the data. We can always replace the IT part of the team with another database developer; however, we cannot conduct this kind of project without the customer's SME.

    After the data preparation, and when the data overview is available, we begin the scientific part of the project. I assist the team in developing a variety of models, and in interpreting the results. The results are presented graphically, in an intuitive way. While it is possible to interpret the results on the fly, a much more appropriate alternative is possible if the initial training was also performed, because it allows the customer's personnel to interpret the results by themselves, with only some guidance from me. The models are evaluated immediately by using several different techniques. One of the techniques includes evaluation over time, where we use an OLAP cube. After evaluating the models, we select the most appropriate model to be deployed for a production test; this allows the team to understand the deployment process. There are many possibilities for deploying data mining models into production; at the POC stage, we select the one that can be completed quickly. Typically, this means that we add the mining model as an additional dimension to an existing DW or OLAP cube, or to the OLAP cube developed during the data overview phase. Finally, we spend some time presenting the results of the POC project to the stakeholders and managers.

    Even from a POC, the customer will receive lots of benefits, all at the sole risk of spending money and time on a single 5 to 10 day project:
    - The customer learns the basic patterns of frauds and fraud detection
    - The customer learns how to do the entire cycle with their own people, only relying on me for the most complex problems
    - The customer's analysts learn how to perform much more in-depth analyses than they ever thought possible
    - The customer's IT experts learn how to perform data extraction and preparation much more efficiently than they did before
    - All of the attendees of this training learn how to use their own creativity to implement further improvements of the process and procedures, even after the solution has been deployed to production
    - The POC output for a smaller company or for a subsidiary of a larger company can actually be considered a finished, production-ready solution
    - It is possible to utilize the results of the POC project at subsidiary level, as a finished POC project for the entire enterprise

    Typically, the project also results in several important "side effects":
    - Improved data quality
    - Improved employee job satisfaction, as they are able to proactively contribute to the central knowledge about fraud patterns in the organization
    - Because eventually more minds get involved in the enterprise, the company should expect more and better fraud detection patterns

    After the POC project is completed as described above, the actual project does not need months of engagement from my side. This is possible due to our preference to transfer the knowledge to the customer's employees: typically, the customer will use the results of the POC project for some time, and only engage me again to complete the project, or to ask for additional expertise if the complexity of the problem increases significantly.

    I usually expect to perform the following tasks:
    - Establish the final infrastructure to measure the efficiency of the deployed models
    - Deploy the models in additional scenarios:
      - Through reports
      - By including Data Mining Extensions (DMX) queries in OLTP applications to support real-time early warnings (a sketch of such a query appears at the end of this post)
    - Include data mining models as dimensions in OLAP cubes, if this was not done already during the POC project
    - Create smart ETL applications that divert suspicious data for immediate or later inspection
    - Investigate how the outcome could be transferred automatically to the central system; for instance, if the POC project was performed in a subsidiary whereas a central system is available as well
    - Of course, for the actual project, repeat the data and model preparation as needed

    It is virtually impossible to tell in advance how much time the deployment will take before we decide, together with the customer, what exactly the deployment process should cover. Without considering the deployment part, and with the POC project conducted as suggested above (including the transfer of knowledge), the actual project should still only take an additional 5 to 10 days.

    The approximate timeline for the POC project is as follows:
    - 1-2 days of training
    - 2-3 days for data preparation and data overview
    - 2 days for creating and evaluating the models
    - 1 day for initial preparation of the continuous learning infrastructure
    - 1 day for presentation of the results and discussion of further actions

    Quite frequently I receive the following question: are we going to find the best possible model during the POC project, or during the actual project? My answer is always quite simple: I do not know. Maybe, if we spent just one hour more on data preparation, or created just one more model, we could get better patterns and predictions. However, we simply must stop somewhere, and the best possible way to do this, according to my experience, is to restrict the time spent on the project in advance, after an agreement with the customer. You must also never forget that, because we build the complete learning infrastructure and transfer the knowledge, the customer will be capable of doing further investigations independently and of improving the models and predictions over time, without the need for constant engagement with me.
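
    As an illustration of the DMX option above — a minimal sketch only, where the [Fraud Model] mining model and its [Is Fraud], [Amount], and [Channel] columns are hypothetical names, not from an actual engagement:

        -- Hypothetical singleton prediction query an OLTP application
        -- could issue against a deployed mining model for a real-time warning
        SELECT
            Predict([Fraud Model].[Is Fraud]) AS PredictedIsFraud,
            PredictProbability([Fraud Model].[Is Fraud]) AS FraudProbability
        FROM [Fraud Model]
        NATURAL PREDICTION JOIN
        (SELECT 1500 AS [Amount],
                'Card Not Present' AS [Channel]) AS t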

    Read the article

  • Rules for Naming

    - by PointsToShare
    © 2011 By: Dov Trietsch. All rights reserved.

    Naming Documents (or is it "Document, Naming"?)

    Tis but thy name that is my enemy;
    Thou art thyself, though not a Montague.
    What's Montague? It is nor hand, nor foot,
    Nor arm, nor face, nor any other part
    Belonging to a man. O, be some other name!
    What's in a name? That which we call a rose
    By any other name would smell as sweet;
    So Romeo would, were he not Romeo call'd,
    Retain that dear perfection which he owes
    Without that title. Romeo, doff thy name
    And for that name which is no part of thee
    Take all myself.
    Shakespeare – Romeo and Juliet, Act II, Scene 2

    We normally only use the bold portion of the famous Shakespearean quote above, but it is really out of context. As the play unfolds, we learn that a name is all too powerful. Indeed, it is because of their names that the doomed lovers die. There might be life and death in a name (BTW, when I wrote this monogram, I was in Hatfield, PA. Remember the Hatfields and the McCoys?). This is a bit extreme, but in the field of Knowledge Management (KM), names are of the utmost importance as well. When I write an article about managing SharePoint sites, how should I name it? "Managing a site" or "Site, managing"? Nine times out of ten I'd opt for the latter. Almost everything we do is "managing", so to make life easier for a person looking for meaningful content, we title our articles starting with the differentiator rather than the common factor. As a rule of thumb, we start the name with the noun rather than the verb. It is not what we do that is the primary key; it is what we do it to.

    So, answer this – is it a "rule of thumb" or a "thumb rule"? This is tough. A lot of what we do when naming is a judgment call. Both thumb and rule are nouns, albeit concrete and abstract (more about this later), but to most people "thumb rule" is meaningless while "rule of thumb" is an idiom. The difference between knowledge and information is that knowledge is meaningful information placed in context. Thus I elect the "rule of thumb". It is the more meaningful title.

    Abstract and concrete are relative terms. Many nouns (and verbs) that are abstract to a commoner are concrete to a practitioner of one profession or another, and may even have different concrete meanings in different professional jargons. Think about "running". To an executive it means running a business; to a marathoner its meaning is much more literal. Generally speaking, we store and disseminate knowledge within a practice more than we do in general. Even dictionaries and encyclopedias define terms as they apply to different audiences. The rule of thumb is to put the more concrete first, but within the audience's jargon.

    Even the title of this monogram is a question. Do I name it "Naming Documents" or "Documents, Naming"? Well, my own rule of thumb ("Here he goes again!?") states that the latter is better because it starts with a noun, but this is a document about naming more than it is about documents. The rules of naming also apply to graphs and charts, Excel spreadsheets, and so on. Thus, I vote for the former. A better title could have been "Naming Objects", only the word "Object" is a bit too abstract. How about just "Naming" or "Naming, rules of"? You get the drift. One way to resolve all of this is to store the documents in knowledge bases, which may become the subject of a future punditry. Knowledge bases use keywords to describe their content. Use a metadata store for the keywords to at least attempt some common ground.

    Here is another general rule (rule of thumb?!!) – put at least one keyword in the title, and use subtitles. Here is an example: "Migrating documents – Screening, cleaning, and organizing our knowledge." The main keyword is "documents", next is "migrating"; other keywords appear in the subtitle: "screening", "cleaning", and "organizing". Any questions? Send me an amply named document by email: [email protected]

    Read the article

  • How TiVo is messing up customer support.

    - by James Fleming
    OK, so I've gotten a TiVo and overall I'm happy, but there have been issues and I suspect I have a defective unit. Now, the nice folks, after many service calls, were happy to swap it out, and to ensure continuity of service they sent me a new unit (after a $109 deposit). That was yesterday. Today, when we go to watch a little TV while waiting for our replacement unit to arrive, we find our TiVo service has been suspended. WTF? They have an exchange program, but the unit you're waiting to exchange is as dead as a doornail until the replacement arrives. How hard is it to keep the old unit active for an extra week? Here is the exchange with TiVo below...

    You are currently number 1 in the queue. We apologize for the delay. We will assign you to an agent as soon as one is available. The average amount of time a customer has to wait is 00:13.
    Kaylene (Listening)
    Kaylene: Thank you for contacting TiVo! My name is Kaylene. So that I may better assist you, are you an existing customer?
    james Fleming: yes I am, but I'm now having second thoughts about being one
    Kaylene: Thank you for verifying your information. How may I assist you today James?
    james Fleming: I've been having issues w/a tivo box & I'm getting a replacement sent out to me (after paying an additional deposit) and now my current unit is no longer activated
    Kaylene: I can help you today!
    Kaylene: When we process an exchange we do transfer over the service to the replacement box so it is active and ready to go when you receive it.
    james Fleming: which is to say you also make my current box worthless until such time I receive a new box?!?!?
    Kaylene: I apologize that your original box was deactivated so we could activate your replacement box.
    james Fleming: Why on Earth would I bother to pay in advance for a new box if you were going to kill my existing box.
    Kaylene: What features are you needing to use on your current box?
    james Fleming: I need to be able to access my netflix subscription (if I'm lucky enough to have it work without rebooting)
    Kaylene: Can I have you verify the TiVo Service Number of your TiVo box please?
    james Fleming: 7460011906979b4
    Kaylene: We have your current box on temporary service, but not all features are available with temporary service as it is not paid-for service.
    Kaylene: If you like I can transfer your service back to your current box for now. Then once you receive the new box you will have to call in and have the service transferred back to the new box.
    james Fleming: Not paid for? Let's see> one tivo box + 3 year service plan + monthly service + $109 deposit on a second box = what?
    Kaylene: Would you like me to transfer your service back to your current box?
    james Fleming: Yes - that would be helpful
    Kaylene: All you will need to do is contact us again once you receive the new box so we can transfer it back.
    Kaylene: I have put your service back on TiVo box 7460011906979b4.
    james Fleming: What would also be helpful is your firm informing me as to how you'd be cutting service in the interim.
    james Fleming: Again - I opted to pay to have a second box delivered BEFORE returning the box I have - thus trying to have a continuity of service..
    Kaylene: This is not something we normally do, so it is important when you contact us to transfer the service back to the new box when you receive it that you reference this case number: 110622-006089.
    Kaylene: I apologize about the inconvenience. You may need to force a few connections for the box to recognize the service again.
    james Fleming: If it's not something you normally do then WHY would you have a $109 fee and a term for the service.
    james Fleming: I am not mad at you, but your company is not impressing me and I'm blogging about this experience
    Kaylene: Again I apologize about the inconvenience but you should be good to go now. Is there anything else I can help you with today?
    james Fleming: so I need to go through the re-activate process or is that something you do
    Kaylene: When you receive the new TiVo box you need to contact us so we can transfer the service to the new box for you.
    james Fleming: sure
    Kaylene: Is there anything else I can help you with today James?
    james Fleming: Nope - please email this transcript to me
    Kaylene: I apologize but we do not have the ability to e-mail you a copy of this transcript. You can view it online at http://www.tivo.com when you sign into your account or you can copy and paste it now to save it.
    Kaylene: Thank you for contacting TiVo today. Your reference number for our conversation is 110622-006089. You can save this for your records, and if necessary, provide this to a later agent to pull up what we discussed. There will be a brief satisfaction survey emailed to you. We would appreciate any feedback on your TiVo Chat Support experience today.
    Kaylene: Thank you for using TiVo Chat and have a great day James! Good-bye.
    Kaylene has disconnected.

    Read the article

  • Perm SSIS Developer Urgently Required

    - by blakmk
    Job Role

    To provide dedicated data services support to the company by designing, creating, maintaining and enhancing database objects, ensuring data quality, consistency and integrity. Migrating data from various sources to a central SQL 2008 data warehouse will be the primary function:

    - Migration of data from bespoke legacy databases to the SQL 2008 data warehouse.
    - Understand key business requirements, liaising with various areas of the company.
    - Create advanced transformations of data, with a focus on data cleansing, redundant data and duplication.
    - Create complex business rules regarding data services, migration, integrity and support (best practices).

    Experience

    - Minimum 3 years' SSIS experience in a project or BI development role, and involvement in at least 3 full ETL project life cycles, using the following methodologies and tools:
      - Excellent knowledge of ETL concepts including data migration and integrity, focusing on SSIS.
      - Extensive experience with SQL 2005 products; SQL 2008 desirable.
      - Working knowledge of SSRS and its integration with other BI products.
      - Extensive knowledge of T-SQL, stored procedures, triggers (table/database), views and functions, in particular coding and querying.
      - Data cleansing and harmonisation.
      - Understanding and knowledge of indexes, statistics and table structure.
      - SQL Agent – scheduling jobs, optimisation, multiple jobs, DTS.
      - Troubleshoot, diagnose and tune database and physical server performance.
      - Knowledge and understanding of locking, blocks, table and index design, and SQL configuration.
    - Demonstrable ability to understand and analyse business processes.
    - Experience in creating business rules on best practices for data services.
    - Experience in working with, supporting and troubleshooting MS SQL servers running enterprise applications.
    - Proven ability to work well within a team and liaise with other technical support staff such as networking administrators, system administrators and support engineers.
    - Ability to create formal documentation, work procedures, and service level agreements.
    - Ability to communicate technical issues at all levels, including to a non-technical audience.
    - Good working knowledge of MS Word, Excel, PowerPoint, Visio and Project.

    Location

    Based in Crawley, with the possibility of some remote working.

    Contact me for more info: http://sqlblogcasts.com/blogs/blakmk/contact.aspx

    Read the article

  • How do I sync music to my Sony Walkman (Z Series) using Rhythmbox?

    - by Mark Paskal
    I have recently purchased the new Walkman Z from Sony. I can transfer music by mounting it as a drive, but I would prefer to use Rhythmbox to do so. My other Android devices and MP3 players from the past have always just shown up without any tweaking. Using Nautilus to transfer the files is possible, but for some reason Nautilus still creates a trash folder on removable drives, and deleting that trash folder every time I delete music from the device is really annoying. How can I use Rhythmbox to transfer and remove songs instead?

    Read the article

  • Can't control connection bit rate using iwconfig with Atheros TL-WN821N (AR7010)

    - by Paul H
    I'm trying to reduce the connection bit rate on my Atheros TP-Link TL-WN821N v3 USB wifi adapter due to frequent instability issues (the reported connection speed goes down to 1 Mb/s and I have to physically reconnect the adapter to regain a connection). I know this is a common problem with this device, and I have tried everything I can think of to fix it, including using drivers from linux-backports, compiling and installing a custom firmware (following instructions on https://wiki.debian.org/ath9k_htc#fw-free) and (as a last resort) using ndiswrapper.

    When using ndiswrapper, the wifi adapter is stable and operates in g mode at 54 Mb/s (whilst when using the default ath9k_htc module, the adapter connects in n mode and the bit rate fluctuates constantly). Unfortunately, with this setup I have to run my processor using only one core, since using SMP with ndiswrapper causes a kernel oops on my system. So I want to lock my bit rate to 54 Mb/s (or less, if need be) for connection stability, using the ath9k_htc module.

    I've tried 'sudo iwconfig wlan0 rate 54M'; the command runs with no error, but when I check the bit rate with 'sudo iwlist wlan0 bitrate' the command returns:

    wlan0 unknown bit-rate information.
    Current Bit Rate:78 Mb/s

    Any ideas? Here's some info (hopefully relevant) on my setup: Xubuntu 12.04.3 64-bit (kernel 3.2.0-55.85-generic) using Network Manager. My router is the VMDG480 from Virgin Media.

    lshw -C network:

    *-network
      description: Wireless interface
      physical id: 1
      bus info: usb@1:4
      logical name: wlan0
      serial: 74:ea:3a:8f:16:b6
      capabilities: ethernet physical wireless
      configuration: broadcast=yes driver=ath9k_htc driverversion=3.2.0-55 firmware=1.3 ip=192.168.0.9 link=yes multicast=yes wireless=IEEE 802.11bgn

    lsusb -v:

    Bus 001 Device 003: ID 0cf3:7015 Atheros Communications, Inc. TP-Link TL-WN821N v3 802.11n [Atheros AR7010+AR9287]
    Device Descriptor:
      bLength 18
      bDescriptorType 1
      bcdUSB 2.00
      bDeviceClass 255 Vendor Specific Class
      bDeviceSubClass 255 Vendor Specific Subclass
      bDeviceProtocol 255 Vendor Specific Protocol
      bMaxPacketSize0 64
      idVendor 0x0cf3 Atheros Communications, Inc.
      idProduct 0x7015 TP-Link TL-WN821N v3 802.11n [Atheros AR7010+AR9287]
      bcdDevice 2.02
      iManufacturer 16 ATHEROS
      iProduct 32 UB95
      iSerial 48 12345
      bNumConfigurations 1
      Configuration Descriptor:
        bLength 9
        bDescriptorType 2
        wTotalLength 60
        bNumInterfaces 1
        bConfigurationValue 1
        iConfiguration 0
        bmAttributes 0x80 (Bus Powered)
        MaxPower 500mA
        Interface Descriptor:
          bLength 9
          bDescriptorType 4
          bInterfaceNumber 0
          bAlternateSetting 0
          bNumEndpoints 6
          bInterfaceClass 255 Vendor Specific Class
          bInterfaceSubClass 0
          bInterfaceProtocol 0
          iInterface 0
          Endpoint Descriptor:
            bLength 7, bDescriptorType 5, bEndpointAddress 0x01 EP 1 OUT
            bmAttributes 2 (Transfer Type Bulk, Synch Type None, Usage Type Data)
            wMaxPacketSize 0x0200 1x 512 bytes, bInterval 0
          Endpoint Descriptor:
            bLength 7, bDescriptorType 5, bEndpointAddress 0x82 EP 2 IN
            bmAttributes 2 (Transfer Type Bulk, Synch Type None, Usage Type Data)
            wMaxPacketSize 0x0200 1x 512 bytes, bInterval 0
          Endpoint Descriptor:
            bLength 7, bDescriptorType 5, bEndpointAddress 0x83 EP 3 IN
            bmAttributes 3 (Transfer Type Interrupt, Synch Type None, Usage Type Data)
            wMaxPacketSize 0x0040 1x 64 bytes, bInterval 1
          Endpoint Descriptor:
            bLength 7, bDescriptorType 5, bEndpointAddress 0x04 EP 4 OUT
            bmAttributes 3 (Transfer Type Interrupt, Synch Type None, Usage Type Data)
            wMaxPacketSize 0x0040 1x 64 bytes, bInterval 1
          Endpoint Descriptor:
            bLength 7, bDescriptorType 5, bEndpointAddress 0x05 EP 5 OUT
            bmAttributes 2 (Transfer Type Bulk, Synch Type None, Usage Type Data)
            wMaxPacketSize 0x0200 1x 512 bytes, bInterval 0
          Endpoint Descriptor:
            bLength 7, bDescriptorType 5, bEndpointAddress 0x06 EP 6 OUT
            bmAttributes 2 (Transfer Type Bulk, Synch Type None, Usage Type Data)
            wMaxPacketSize 0x0200 1x 512 bytes, bInterval 0
    Device Qualifier (for other device speed):
      bLength 10
      bDescriptorType 6
      bcdUSB 2.00
      bDeviceClass 255 Vendor Specific Class
      bDeviceSubClass 255 Vendor Specific Subclass
      bDeviceProtocol 255 Vendor Specific Protocol
      bMaxPacketSize0 64
      bNumConfigurations 1
    Device Status: 0x0000 (Bus Powered)

    iwlist wlan0 scanning:

    wlan0 Scan completed:
    Cell 01 - Address: C4:3D:C7:3A:1F:5D
      Channel:1
      Frequency:2.412 GHz (Channel 1)
      Quality=37/70 Signal level=-73 dBm
      Encryption key:on
      ESSID:"my essid"
      Bit Rates:1 Mb/s; 2 Mb/s; 5.5 Mb/s; 11 Mb/s; 18 Mb/s; 24 Mb/s; 36 Mb/s; 54 Mb/s
      Bit Rates:6 Mb/s; 9 Mb/s; 12 Mb/s; 48 Mb/s
      Mode:Master
      Extra:tsf=00000070cca77186
      Extra: Last beacon: 5588ms ago
      IE: Unknown: 0007756E69636F726E
      IE: Unknown: 010882848B962430486C
      IE: Unknown: 030101
      IE: Unknown: 2A0100
      IE: Unknown: 2F0100
      IE: IEEE 802.11i/WPA2 Version 1
        Group Cipher : TKIP
        Pairwise Ciphers (2) : CCMP TKIP
        Authentication Suites (1) : PSK
      IE: Unknown: 32040C121860
      IE: Unknown: 2D1AFC181BFFFF000000000000000000000000000000000000000000
      IE: Unknown: 3D1601080400000000000000000000000000000000000000
      IE: Unknown: DD7E0050F204104A0001101044000102103B00010310470010F99C335D7BAC57FB00137DFA79600220102100074E657467656172102300074E6574676561721024000631323334353610420007303030303030311054000800060050F20400011011000743473331303144100800022008103C0001011049000600372A000120
      IE: Unknown: DD090010180203F02C0000
      IE: WPA Version 1
        Group Cipher : TKIP
        Pairwise Ciphers (2) : CCMP TKIP
        Authentication Suites (1) : PSK
      IE: Unknown: DD180050F2020101800003A4000027A4000042435E0062322F00

    iwconfig:

    lo no wireless extensions.
    wlan0 IEEE 802.11bgn ESSID:"my essid"
      Mode:Managed Frequency:2.412 GHz Access Point: C4:3D:C7:3A:1F:5D
      Bit Rate=78 Mb/s Tx-Power=20 dBm
      Retry long limit:7 RTS thr:off Fragment thr:off
      Power Management:off
      Link Quality=36/70 Signal level=-74 dBm
      Rx invalid nwid:0 Rx invalid crypt:0 Rx invalid frag:0
      Tx excessive retries:0 Invalid misc:0 Missed beacon:0
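
    A note on the rate command (hedged, as I can't verify it on this exact driver): with nl80211-based drivers like ath9k_htc, the old wireless-extensions call behind iwconfig ... rate is often accepted but silently ignored, which would explain the command succeeding with no effect. The nl80211-native equivalent is the iw tool, e.g.:

        # restrict the interface to legacy 2.4 GHz rates up to 54 Mb/s
        sudo iw dev wlan0 set bitrates legacy-2.4 54

    (The interface may need to be reconnected afterwards for the setting to apply.)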

    Read the article

  • Slow transfer to external USB3 hard drive

    - by JMP
    I'm trying to back up data from a hard drive before reloading Windows, following some issue with its load, but I'm having trouble with the file transfer to a USB 3.0/2.0 external hard drive (NTFS). I'm getting a transfer speed of about 116.7 kB/sec; in other words, it's taking about 5 hours to transfer 1.4 GB, and I've got about 80 GB to go, so the transfer is going to take 11 days. That seems a little on the slow side. Am I missing something? Is there a way to make this faster? There is no issue with the external drive transferring this amount in Windows, but I don't have that option at the moment.

    Read the article

  • WebCenter Customer Spotlight: Guizhou Power Grid Company

    - by me
    Author: Peter Reiser - Social Business Evangelist, Oracle WebCenter

    Solution Summary

    Guizhou Power Grid Company is responsible for power grid planning, construction, management, and power distribution in Guizhou Province, serving 39 million people. Guizhou has 49,823 employees and an annual revenue of over $5 billion. The business objectives were to consolidate information contained in disparate systems into a single knowledge repository and provide a safe and efficient way for staff and managers to access, query, share, manage, and store business information. Guizhou Power Grid Company saved more than US$693,000 in storage costs, reduced average search times from 180 seconds to 5 seconds, and solved 80% to 90% of technology and maintenance issues by searching the Oracle WebCenter Content management system.

    Company Overview

    A wholly owned subsidiary of China Southern Power Grid Company Limited, Guizhou Power Grid Company is responsible for power grid planning, construction, management, and power distribution in Guizhou Province, serving 39 million people. Guizhou has 49,823 employees and an annual revenue of over $5 billion.

    Business Challenges

    The business objectives were to consolidate information contained in disparate systems, such as the customer relationship management and power grid management systems, into a single knowledge repository and provide a safe and efficient way for staff and managers to access, query, share, manage, and store business information.

    Solution Deployed

    Guizhou Power Grid Company implemented Oracle WebCenter Content to build a content management system that enabled the secure, integrated management and storage of information, such as documents, records, images, Web content, and digital assets. The content management solution was integrated with the power grid, customer service, maintenance, and other business systems, as well as the corporate Web site.

    Business Results

    - Saved more than US$693,000 in storage costs and shortened material distribution time by integrating the knowledge management solution with the power grid, customer service, maintenance, and other business systems, as well as the corporate Web site
    - Enabled staff to search 31,650 documents using catalogs, multidimensional attributes, and knowledge maps, reducing average search times from 180 seconds to 5 seconds and saving approximately 1,539 hours in annual search time
    - Gained comprehensive document management, format transformation, security, and auditing capabilities
    - Enabled users to upload new documents and supervisors to check the accuracy of these documents online, resulting in improved information quality control
    - Solved 80% to 90% of technology and maintenance issues by searching the Oracle content management system for information, ensuring IT staff can respond quickly to users' technical problems
    - Improved security by using role-based access controls to restrict access to confidential documents and information
    - Supported the efficient classification of corporate knowledge by using Oracle's metadata functions to collect, tag, and archive documents, images, Web content, and digital assets

    "We chose Oracle WebCenter Content, as it is an outstanding integrated content management platform. It has allowed us to establish a system to access, query, share, manage, and store our corporate assets. This has laid a solid foundation for Guizhou Power Grid Company to improve management practices." – Luo Sixi, Senior Information Consultant, Guizhou Power Grid Company

    Additional Information: Guizhou Power Grid Company Customer Snapshot; Oracle WebCenter Content

    Read the article

  • SQL Saturday 194 - Exeter

    - by Dave Ballantyne
    Many kudos go to Jonathan and Annette Allen and the others on the team for confirming SQL Saturday 194 in Exeter on the 8th and 9th of March. The event home page is here: http://www.sqlsaturday.com/194/eventhome.aspx, and I'm delighted that Dave Morrison and I will be presenting a full-day pre-con on the 8th on favourite subjects "TSQL and Internals". Here is the full abstract:

    TSQL and internals - When faced with performance issues there are many lines of attack. Tuning the engine itself can get you so far; however, for maximum effect you need to understand how the engine works and how it translates SQL statements into performable actions. This is not a simple task — it is a massive task to deal with a multi-table join, and the number of permutations can be immense. With this knowledge to back us up, we can create better-performing TSQL, understand the impact it has upon the engine, and recognize the pitfalls and gotchas that exist in SQL Server. Ultimately, there is no 'best way' to perform a single task, only many variations of 'it depends', but now we can pick the most appropriate option for the required data load. Over the years, many myths and misconceptions have grown around the product; some have a basis in older versions and some are just wrong. Continuing to build on the knowledge given so far, these issues will be explored, broken down, and proved or disproved. Finally we will look to the future and explore SQL Server 2012, the new functionality it brings, and some of the common uses that we will be able to address. After completing this day's pre-con, attendees will have a more complete knowledge of execution plans and how they relate to the physical and logical actions that SQL Server will be executing on their behalf. The attendees will also have a more rounded and fuller knowledge of TSQL and the implications of incorrectly defining a query.

    Dave is a fountain of knowledge on execution plans and optimizer internals and, though I may flatter myself, I'm no shrinking violet when it comes to TSQL and such matters. I hope that if you can't join us, there are other pre-cons available from other experts in their fields that may 'float your boat' too. The pre-con page is http://sqlsouthwest.co.uk/SQLSaturday_precon.htm

    Also, excitingly, this pre-con day is sponsored by Fusion-io, which is a great boon for the day. If you want more of this, I am offering a 2-day TSQL course starting on the 19th of March. More details on this are available here.

    Read the article

  • QFtp mput rawCommand

    - by krishna
    Hello, I have a question regarding multiple file transfer with QFtp. There is no direct way to transfer multiple files with the QFtp class. I tried the arbitrary FTP command "mput" via rawCommand() in QFtp, but it doesn't work for me. Please let me know how I can do a multiple file transfer with QFtp. Thanks,
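
    One likely reason rawCommand("mput ...") fails: MPUT is a convenience command implemented inside interactive FTP clients, not a verb in the FTP protocol itself, so the server rejects it. The usual workaround is to queue one put() per file — a minimal sketch (Qt 4 QFtp from the QtNetwork module; the host, credentials, parent, and localFiles list are hypothetical):

        // Queue one STOR per file; QFtp executes the queue asynchronously.
        QFtp *ftp = new QFtp(parent);
        ftp->connectToHost("ftp.example.com");
        ftp->login("user", "password");
        foreach (const QString &path, localFiles) {
            QFile *file = new QFile(path);   // delete it in the commandFinished() slot
            file->open(QIODevice::ReadOnly);
            ftp->put(file, QFileInfo(path).fileName());
        }
        ftp->close();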

    Read the article

  • Can't Remove Logical Drive/Array from HP P400

    - by Myles
    This is my first post here. Thank you in advance for any assistance with this matter. I'm trying to remove a logical drive (logical drive 2) and an array (array "B") from my Smart Array P400. The host is a DL580 G5 running 64-bit Red Hat Enterprise Linux Server release 5.7 (Tikanga). I am unable to remove the array using either hpacucli or cpqacuxe; I believe it is because of "OS Status: LOCKED". The file system that lives on this array has been unmounted. I do not want to reboot the host. Is there some way to "release" this logical drive so I can remove the array? Note that I do not need to preserve the data on logical drive 2; I intend to physically remove the drives from the machine and replace them with larger drives. I'm using the cciss kernel module that ships with Red Hat 5.7. Here is some information pertaining to the host and the P400 configuration:

    [root@gort ~]# cat /etc/redhat-release
    Red Hat Enterprise Linux Server release 5.7 (Tikanga)
    [root@gort ~]# uname -a
    Linux gort 2.6.18-274.el5 #1 SMP Fri Jul 8 17:36:59 EDT 2011 x86_64 x86_64 x86_64 GNU/Linux
    [root@gort ~]# rpm -qa | egrep '^(hp|cpq)'
    cpqacuxe-9.30-15.0
    hp-health-9.25-1551.7.rhel5
    hpsmh-7.1.2-3
    hpdiags-9.3.0-466
    hponcfg-3.1.0-0
    hp-snmp-agents-9.25-2384.8.rhel5
    hpacucli-9.30-15.0
    [root@gort ~]# hpacucli
    HP Array Configuration Utility CLI 9.30.15.0
    Detecting Controllers...Done.
    Type "help" for a list of supported commands.
    Type "exit" to close the console.

    => ctrl all show config detail

    Smart Array P400 in Slot 0 (Embedded)
      Bus Interface: PCI
      Slot: 0
      Cache Serial Number: PA82C0J9SVW34U
      RAID 6 (ADG) Status: Enabled
      Controller Status: OK
      Hardware Revision: D
      Firmware Version: 7.22
      Rebuild Priority: Medium
      Expand Priority: Medium
      Surface Scan Delay: 15 secs
      Surface Scan Mode: Idle
      Wait for Cache Room: Disabled
      Surface Analysis Inconsistency Notification: Disabled
      Post Prompt Timeout: 0 secs
      Cache Board Present: True
      Cache Status: OK
      Cache Ratio: 25% Read / 75% Write
      Drive Write Cache: Disabled
      Total Cache Size: 256 MB
      Total Cache Memory Available: 208 MB
      No-Battery Write Cache: Disabled
      Cache Backup Power Source: Batteries
      Battery/Capacitor Count: 1
      Battery/Capacitor Status: OK
      SATA NCQ Supported: True

      Logical Drive: 1
        Size: 136.7 GB
        Fault Tolerance: RAID 1
        Heads: 255
        Sectors Per Track: 32
        Cylinders: 35132
        Strip Size: 128 KB
        Full Stripe Size: 128 KB
        Status: OK
        Caching: Enabled
        Unique Identifier: 600508B100184A395356573334550002
        Disk Name: /dev/cciss/c0d0
        Mount Points: /boot 101 MB, /tmp 7.8 GB, /usr 3.9 GB, /usr/local 2.0 GB, /var 3.9 GB, / 2.0 GB, /local 113.2 GB
        OS Status: LOCKED
        Logical Drive Label: A0027AA78DEE
        Mirror Group 0: physicaldrive 1I:1:2 (port 1I:box 1:bay 2, SAS, 146 GB, OK)
        Mirror Group 1: physicaldrive 1I:1:1 (port 1I:box 1:bay 1, SAS, 146 GB, OK)
        Drive Type: Data

      Array: A
        Interface Type: SAS
        Unused Space: 0 MB
        Status: OK
        Array Type: Data

        physicaldrive 1I:1:1
          Port: 1I, Box: 1, Bay: 1, Status: OK
          Drive Type: Data Drive
          Interface Type: SAS
          Size: 146 GB
          Rotational Speed: 10000
          Firmware Revision: HPDE
          Serial Number: 3NM57RF40000983878FX
          Model: HP DG146BB976
          Current Temperature (C): 29
          Maximum Temperature (C): 35
          PHY Count: 2
          PHY Transfer Rate: Unknown, Unknown

        physicaldrive 1I:1:2
          Port: 1I, Box: 1, Bay: 2, Status: OK
          Drive Type: Data Drive
          Interface Type: SAS
          Size: 146 GB
          Rotational Speed: 10000
          Firmware Revision: HPDE
          Serial Number: 3NM55VQC000098388524
          Model: HP DG146BB976
          Current Temperature (C): 29
          Maximum Temperature (C): 36
          PHY Count: 2
          PHY Transfer Rate: Unknown, Unknown

      Logical Drive: 2
        Size: 546.8 GB
        Fault Tolerance: RAID 5
        Heads: 255
        Sectors Per Track: 32
        Cylinders: 65535
        Strip Size: 64 KB
        Full Stripe Size: 256 KB
        Status: OK
        Caching: Enabled
        Parity Initialization Status: Initialization Completed
        Unique Identifier: 600508B100184A395356573334550003
        Disk Name: /dev/cciss/c0d1
        Mount Points: None
        OS Status: LOCKED
        Logical Drive Label: A5C9C6F81504
        Drive Type: Data

      Array: B
        Interface Type: SAS
        Unused Space: 0 MB
        Status: OK
        Array Type: Data

        physicaldrive 1I:1:3
          Port: 1I, Box: 1, Bay: 3, Status: OK
          Drive Type: Data Drive
          Interface Type: SAS
          Size: 146 GB
          Rotational Speed: 10000
          Firmware Revision: HPDE
          Serial Number: 3NM2H5PE00009802NK19
          Model: HP DG146ABAB4
          Current Temperature (C): 30
          Maximum Temperature (C): 37
          PHY Count: 1
          PHY Transfer Rate: Unknown

        physicaldrive 1I:1:4
          Port: 1I, Box: 1, Bay: 4, Status: OK
          Drive Type: Data Drive
          Interface Type: SAS
          Size: 146 GB
          Rotational Speed: 10000
          Firmware Revision: HPDE
          Serial Number: 3NM28YY400009750MKPJ
          Model: HP DG146ABAB4
          Current Temperature (C): 31
          Maximum Temperature (C): 36
          PHY Count: 1
          PHY Transfer Rate: 3.0Gbps

        physicaldrive 2I:1:5
          Port: 2I, Box: 1, Bay: 5, Status: OK
          Drive Type: Data Drive
          Interface Type: SAS
          Size: 146 GB
          Rotational Speed: 10000
          Firmware Revision: HPDE
          Serial Number: 3NM2FGYV00009802N3GN
          Model: HP DG146ABAB4
          Current Temperature (C): 30
          Maximum Temperature (C): 38
          PHY Count: 1
          PHY Transfer Rate: Unknown

        physicaldrive 2I:1:6
          Port: 2I, Box: 1, Bay: 6, Status: OK
          Drive Type: Data Drive
          Interface Type: SAS
          Size: 146 GB
          Rotational Speed: 10000
          Firmware Revision: HPDE
          Serial Number: 3NM8AFAK00009920MMV1
          Model: HP DG146BB976
          Current Temperature (C): 31
          Maximum Temperature (C): 41
          PHY Count: 2
          PHY Transfer Rate: Unknown, Unknown

        physicaldrive 2I:1:7
          Port: 2I, Box: 1, Bay: 7, Status: OK
          Drive Type: Data Drive
          Interface Type: SAS
          Size: 146 GB
          Rotational Speed: 10000
          Firmware Revision: HPDE
          Serial Number: 3NM2FJQD00009801MSHQ
          Model: HP DG146ABAB4
          Current Temperature (C): 29
          Maximum Temperature (C): 39
          PHY Count: 1
          PHY Transfer Rate: Unknown

    Read the article

  • VBA nested Loop flow control

    - by PCGIZMO
    I will be brief and stick to what I know. This code for the most part works as it should. The only issue is in the iteration of the x and z loops. These two loops should set the range and yLABEL for the y loop. I can get through one set and come up with the correct range; after that, things go bonkers. I know some of it has to do with not breaking out of x to set z, and then going back to x to update the range. It should work: z is found, then x, and the range between them is set for y; then on the next x, y stays, and the range between y and x is set for y, and so on and so forth — kinda like a slinky down the stairs, or a slide rule, depending on how I set the loops. Either way, I end up all over the place after a couple of iterations. I have done a few things, but each time I break out of x to set z, x restarts at the top of the range. At least that's what I think I am seeing. In the example sheet I have since changed the way the offset works with the loop, but the idea is still the same. I have GoTo statements at this time; I was going to try figuring out conditional switches after the loops were working. Any help, direction or advice is appreciated.

    Option Explicit

    Sub parse()
        Application.DisplayAlerts = False
        'Application.EnableCancelKey = xlDisabled
        Dim strPath As String, strPathused As String
        strPath = "C:\clerk plan2"
        Dim objfso As FileSystemObject, objFolder As Folder, objfile As Object
        Set objfso = CreateObject("Scripting.FileSystemObject")
        Set objFolder = objfso.GetFolder(strPath)
        'Loop through objWorkBooks
        For Each objfile In objFolder.Files
            If objfso.GetExtensionName(objfile.Path) = "xlsx" Then
                Dim objWorkbook As Workbook
                Set objWorkbook = Workbooks.Open(objfile.Path)
                ' Set path for move to at end of script
                strPathused = "C:\prodplan\used\" & objWorkbook.Name
                objWorkbook.Worksheets("inbound transfer sheet").Activate
                objWorkbook.Worksheets("inbound transfer sheet").Cells.UnMerge
                'Range management WB
                Dim SRCwb As Worksheet, SRCrange1 As Range, SRCrange2 As Range, lastrow As Range
                Set SRCwb = objWorkbook.Worksheets("inbound transfer sheet")
                Set SRCrange1 = SRCwb.Range("g3:g150")
                Set SRCrange2 = SRCwb.Range("a1:a150")
                Dim DSTws As Worksheet
                Set DSTws = Workbooks("clerkplan2.xlsm").Worksheets("transfer")
                Dim STR1 As String, STR2 As String, xVAL As String, zVAL As String, xSTR As String, zSTR As String
                STR1 = "INBOUND TRANS"
                STR2 = "INBOUND CA TRANS"
                Dim x As Variant, z As Variant, y As Variant, zxRANGE As Range
                For Each z In SRCrange2
                    zSTR = Mid(z, 1, 16)
                    If zSTR <> STR2 Then GoTo zNEXT
                    If zSTR = STR2 Then
                        zVAL = z
                    End If
                    For Each x In SRCrange2
                        xSTR = Mid(x, 1, 13)
                        If xSTR <> STR1 Then GoTo xNEXT
                        If xSTR = STR1 Then
                            xVAL = x
                        End If
                        Dim yLABEL As String
                        If xVAL = x And zVAL = z Then
                            If x.Row > z.Row Then
                                Set zxRANGE = SRCwb.Range(x.Offset(1, 0).Address & " : " & z.Offset(-1, 0).Address)
                                yLABEL = z.Value
                            Else
                                Set zxRANGE = SRCwb.Range(z.Offset(-1, 0).Address & " : " & x.Offset(1, 0).Address)
                                yLABEL = x.Value
                            End If
                        End If
                        MsgBox zxRANGE.Address ' DEBUG
                        For Each y In zxRANGE
                            If y.Offset(0, 6) = "Temp" Or y.Offset(0, 14) = "Begin Time" Or y.Offset(0, 15) = "End Time" Or _
                               Len(y.Offset(0, 6)) = 0 Or Len(y.Offset(0, 14)) = 0 Or Len(y.Offset(0, 15)) = "0" Then GoTo yNEXT
                            Set lastrow = Workbooks("clerkplan2.xlsm").Worksheets("transfer").Range("c" & DSTws.Rows.Count).End(xlUp).Offset(1, 0)
                            y.Offset(0, 6).Copy
                            lastrow.PasteSpecial Paste:=xlPasteValues, operation:=xlNone, skipblanks:=True, Transpose:=False
                            DSTws.Activate
                            ActiveCell.Offset(0, -1) = objWorkbook.Name
                            ActiveCell.Offset(0, -2) = yLABEL
                            objWorkbook.Activate
                            y.Offset(0, 14).Copy
                            Set lastrow = Workbooks("clerkplan2.xlsm").Worksheets("transfer").Range("d" & DSTws.Rows.Count).End(xlUp).Offset(1, 0)
                            lastrow.PasteSpecial Paste:=xlPasteValues, operation:=xlNone, skipblanks:=True, Transpose:=False
                            objWorkbook.Activate
                            y.Offset(0, 15).Copy
                            Set lastrow = Workbooks("clerkplan2.xlsm").Worksheets("transfer").Range("e" & DSTws.Rows.Count).End(xlUp).Offset(1, 0)
                            lastrow.PasteSpecial Paste:=xlPasteValues, operation:=xlNone, skipblanks:=True, Transpose:=False
    yNEXT:
                        Next y
    xNEXT:
                    Next x
    zNEXT:
                Next z
                strPathused = "C:\clerk plan2\used\" & objWorkbook.Name
                objWorkbook.Close False
                'Move processed file to new dir
                Dim OldFilePath As String
                Dim NewFilePath As String
                OldFilePath = objfile 'original file location
                NewFilePath = strPathused ' new file location
                Name OldFilePath As NewFilePath ' move the file
            End If
        Next
    End Sub

    Read the article

  • WHMCS - Mapping of a Manually Created Invoice with its Corresponding Domain

    - by Knowledge Craving
    I am using WHMCS version 4.2.1 to maintain domain registration and website hosting services on one of my websites. Currently, the automatic domain registration process is working great for both existing and new clients. The way domains are registered automatically is that the respective invoices are marked as paid, and the domains then get registered through my selected registrar. Recently, I have been facing a major problem with renewals of domains we registered in the past. The problem is that the corresponding invoices for some of these domains are not generated automatically, so I have to manually create an invoice for each of those renewals. However, I am unable to map such an invoice to its corresponding domain. This is required because unless the domain knows that an invoice has been created for its renewal, marking the invoice as paid will not initiate the automatic renewal process through my registrar. Can anybody please tell me the proper way of mapping a manually created invoice to its corresponding domain? I've tried to explain the problem as well as I can; if any more information is required, please ask. Any help is greatly appreciated.

    Read the article

  • Difference between two commands of fetching Shopping Cart Items in Magento

    - by Knowledge Craving
    In Magento, if you need to fetch the shopping cart's item details, you can do it in either of two ways, both of which provide all the shopped items:

    $cartItems1 = $cart->getQuote()->getAllItems();
    $cartItems2 = $cart->getItems()->getData();

    Before using either of the two methods, you need to initialize the shopping cart object:

    $cart = new Mage_Checkout_Model_Cart();
    $cart->init();

    Can anyone please describe in detail what the two options provide and the differences between them, along with their possible usage? If any more such options are available in Magento, can anyone please highlight them?
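
    For quick inspection, the shapes of the two results can be compared directly — a sketch, hedged on the general Varien collection API: getAllItems() on the quote returns an array of quote item model objects, while calling getData() on the items collection returns plain nested arrays:

        // Item model objects: call getters per item
        foreach ($cart->getQuote()->getAllItems() as $item) {
            echo get_class($item) . ': ' . $item->getSku() . "\n";
        }

        // Raw data: each entry is an associative array of the item's fields
        foreach ($cart->getItems()->getData() as $row) {
            echo $row['sku'] . "\n";
        }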

    Read the article
