Search Results

Search found 45469 results on 1819 pages for 'open source contributions'.

  • Does Socket open another thread? Does it return something?

    - by Roman
    In the client application I call new Socket(serverIP, serverPort). As a result, the client application sends a request to the server application to open a socket. Does it start a new thread? I mean, which of the following is true? (1) The client application sends the request and immediately starts to execute the following commands (not waiting for the answer). (2) The client sends the request and waits for the answer; as soon as the answer is obtained, the client application continues to execute the following commands.

    The second case seems more realistic and logical to me. However, I do not understand what happens if the server does not open a socket and also does not say that it does not "want" to open one (this can happen if the server does not exist or the network is broken). What will happen in this case? Will the client wait forever? In general, it would be nice for the client to know the result of its request for the socket. For example, I can imagine the following situations: (1) the socket is opened by the server; (2) the server refuses to open a socket, so the server exists and got the request from the client, but it says "no"; (3) there is no response from the server.

    I know that new Socket(serverIP, serverPort) does not "return" this kind of information, but it does throw exceptions. One of them is UnknownHostException. When is it thrown? When the server has not responded for a while (and for how long)?

    ADDED: I just found out that UnknownHostException is thrown to indicate that the IP address of a host could not be determined. So it is unrelated to the situations described above (server not responding, server refusing to open a socket).
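
    For what it's worth, the constructor connects synchronously in the calling thread: it blocks until the connection is established or fails, and no extra thread is started. A minimal sketch of how the three situations map onto exceptions when an explicit connect timeout is used (the host name and port below are placeholders):

        import java.io.IOException;
        import java.net.ConnectException;
        import java.net.InetSocketAddress;
        import java.net.Socket;
        import java.net.SocketTimeoutException;
        import java.net.UnknownHostException;

        public class ConnectDemo {
            public static void main(String[] args) {
                Socket socket = new Socket(); // unconnected socket, no network I/O yet
                try {
                    // Blocks for at most 5 seconds instead of the OS default.
                    socket.connect(new InetSocketAddress("server.example.com", 4444), 5000);
                    System.out.println("Situation 1: server accepted the connection");
                } catch (UnknownHostException e) {
                    System.out.println("Host name could not be resolved (DNS failure)");
                } catch (ConnectException e) {
                    System.out.println("Situation 2: the connection was actively refused");
                } catch (SocketTimeoutException e) {
                    System.out.println("Situation 3: no response within the timeout");
                } catch (IOException e) {
                    System.out.println("Other network error: " + e.getMessage());
                }
            }
        }

    With the one-argument constructor new Socket(host, port) the same exceptions apply, but the timeout for the no-response case is the operating system's default, which can run to minutes.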

  • Is it legal to take sealed .NET framework class source and extend it?

    - by Giedrius
    To be brief: I'm giving a very specific example, but I'm interested in the general situation. There is a FtpWebRequest class in the .NET Framework, and it is missing some of the new FTP operations, like MFCT. That is OK in the sense that this operation is still in draft mode, but it is not OK in the sense that FtpWebRequest is sealed and there's no other way (at least I don't see one) to extend it with this new operation. The easiest way would be to take the FtpWebRequest class source from the .NET reference sources and extend it; that way all the consistency in naming, implementation, etc. would be kept. The question is: how legal is that? I won't sell this class as a product, and I can publish my changes on the web; there is nothing to hide here. If it is not legal, can I take this class's source from Mono and include it in a native .NET project? Have you had a similar case, and how did you solve it?

    Update: since extension methods were offered as a solution, I'm pasting the source from the .NET Framework, which should show that extension methods are not the solution. There is a property Method, through which you pass the FTP command:

        public override string Method {
            get {
                return m_MethodInfo.Method;
            }
            set {
                if (String.IsNullOrEmpty(value)) {
                    throw new ArgumentException(SR.GetString(SR.net_ftp_invalid_method_name), "value");
                }
                if (InUse) {
                    throw new InvalidOperationException(SR.GetString(SR.net_reqsubmitted));
                }
                try {
                    m_MethodInfo = FtpMethodInfo.GetMethodInfo(value);
                } catch (ArgumentException) {
                    throw new ArgumentException(SR.GetString(SR.net_ftp_unsupported_method), "value");
                }
            }
        }

    As you can see, there is a FtpMethodInfo.GetMethodInfo(value) call in the setter, which basically validates the value against an internal static enum array.

    Update 2: I checked the Mono implementation; it is not an exact replica of the native code, and it does not implement some of the things.
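
    To illustrate the point in the update: extension methods only add call-site syntax on top of the existing public surface, so they still run into the validation inside the sealed Method setter. A hedged sketch (SetCustomMethod is a made-up helper name; the verb comes from the question):

        using System.Net;

        public static class FtpWebRequestExtensions
        {
            // An extension method cannot bypass the sealed class's internals;
            // it can only call what is already public.
            public static void SetCustomMethod(this FtpWebRequest request, string verb)
            {
                // For a draft verb like "MFCT" this reaches
                // FtpMethodInfo.GetMethodInfo, which rejects anything outside
                // the built-in command table, so an ArgumentException
                // ("unsupported method") is thrown right here.
                request.Method = verb;
            }
        }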

  • How do I open a file in such a way that if the file doesn't exist it will be created and opened automatically?

    - by snakile
    Here's how I open a file for writing ("w+"):

        if (fopen_s(&f, fileName, "w+") != 0) {
            printf("Open file failed\n");
            return;
        }
        fprintf_s(f, "content");

    If the file doesn't exist, the open operation fails. What's the right way to fopen if I want the file to be created automatically when it doesn't already exist? EDIT: If the file does exist, I would like fprintf to overwrite the file, not append to it.
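
    For the record, mode "w" (like "w+") is documented to create the file when it is missing and to truncate it when it exists, which covers both requirements in the question; if the call above really does fail for a missing file, the returned errno_t is worth printing. A minimal sketch, assuming the MSVC secure CRT used in the question:

        #include <stdio.h>

        int main(void)
        {
            FILE *f = NULL;
            /* "w": create the file if it does not exist, truncate it if it
               does; i.e. create automatically and overwrite, not append. */
            errno_t err = fopen_s(&f, "output.txt", "w");
            if (err != 0 || f == NULL) {
                printf("Open file failed (errno_t = %d)\n", (int)err);
                return 1;
            }
            fprintf_s(f, "content");
            fclose(f);
            return 0;
        }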

  • How to open files from explorer into different tabs.

    - by Priyank Bolia
    I can associate an "Open with" entry with the file type; now that the program is already working, how do I open a new file in another tab instead of in a new instance of the program? How do I find the exact already-running process (not just by name) and send the filename to it? Let me make myself clear: I want my app to be single-instance, so that when the user selects 10 text files and presses the Enter key, my application opens all 10 text files in 10 tabs instead of creating 10 processes. How do I do that? How do I communicate between various instances of the same process? EDIT, SOLVED: implemented the functionality using WM_COPYDATA in C# and the SingleApplication class from CodeProject.
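
    For readers who land here, a minimal sketch of the WM_COPYDATA approach the asker settled on (the mutex name, window title and the "open in new tab" step are placeholders, not from the original):

        using System;
        using System.Runtime.InteropServices;
        using System.Threading;
        using System.Windows.Forms;

        static class Program
        {
            const int WM_COPYDATA = 0x004A;

            [StructLayout(LayoutKind.Sequential)]
            struct COPYDATASTRUCT { public IntPtr dwData; public int cbData; public IntPtr lpData; }

            [DllImport("user32.dll", CharSet = CharSet.Unicode)]
            static extern IntPtr FindWindow(string lpClassName, string lpWindowName);

            [DllImport("user32.dll")]
            static extern IntPtr SendMessage(IntPtr hWnd, int msg, IntPtr wParam, ref COPYDATASTRUCT lParam);

            [STAThread]
            static void Main(string[] args)
            {
                bool createdNew;
                using (new Mutex(true, "MyTabbedEditorSingleton", out createdNew))
                {
                    if (!createdNew)
                    {
                        // Another instance is running: hand the file name over and exit.
                        IntPtr hWnd = FindWindow(null, "My Tabbed Editor");
                        if (hWnd != IntPtr.Zero && args.Length > 0)
                        {
                            var cds = new COPYDATASTRUCT();
                            cds.lpData = Marshal.StringToHGlobalUni(args[0]);
                            cds.cbData = (args[0].Length + 1) * 2; // UTF-16 bytes incl. terminator
                            SendMessage(hWnd, WM_COPYDATA, IntPtr.Zero, ref cds);
                            Marshal.FreeHGlobal(cds.lpData);
                        }
                        return;
                    }
                    Application.Run(new MainForm());
                }
            }

            class MainForm : Form
            {
                public MainForm() { Text = "My Tabbed Editor"; }

                protected override void WndProc(ref Message m)
                {
                    if (m.Msg == WM_COPYDATA)
                    {
                        var cds = (COPYDATASTRUCT)Marshal.PtrToStructure(m.LParam, typeof(COPYDATASTRUCT));
                        string path = Marshal.PtrToStringUni(cds.lpData);
                        // A real app would open 'path' in a new tab here.
                    }
                    base.WndProc(ref m);
                }
            }
        }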

  • Cannot open database requested by the login. The login failed. Login failed for user

    - by Cipher
    Hi, I have copied a DB from one of my computers and am using it here. On trying to open a page that fetches content from the DB, I get this exception on con.Open:

        Unable to open the physical file "E:\Program Files\Microsoft SQL Server\MSSQL10.SQLEXPRESS\MSSQL\DATA\cakephp.mdf". Operating system error 32: "32(The process cannot access the file because it is being used by another process.)".
        Unable to open the physical file "E:\Program Files\Microsoft SQL Server\MSSQL10.SQLEXPRESS\MSSQL\DATA\cakephp_log.LDF". Operating system error 32: "32(The process cannot access the file because it is being used by another process.)".
        Cannot open database "cakephp" requested by the login. The login failed. Login failed for user 'Sarin-PC\Sarin'.

    I have attached the database from Management Studio Express 2008 and I have also checked the connection string. Here it is:

        <connectionStrings>
            <add name="cn" connectionString="server=.\sqlexpress;database=cakephp;integrated security=true;uid=sarin;pwd=******"/>
        </connectionStrings>

    In Visual Studio, when I test the connection, it says "Test connection succeeded". However, there is one strange thing going on: when I log in to Management Studio, there is no + sign next to the newly attached database, as shown. If the full Web.config is required, I have pasted it here: http://pastebin.com/sVAuN0Ug
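
    One thing worth checking (an observation, not a confirmed fix): the connection string mixes the two authentication modes. With integrated security=true the uid/pwd pair is ignored and the connection runs as the Windows account 'Sarin-PC\Sarin', which is exactly the login the error names; and "operating system error 32" usually means another process (for example a second SQL Server instance, or a user instance spawned via AttachDbFilename) still has the .mdf open. A sketch of the two unambiguous forms of the connection string:

        <connectionStrings>
            <!-- Windows authentication only: the attached DB must grant
                 Sarin-PC\Sarin access. -->
            <add name="cn" connectionString="server=.\sqlexpress;database=cakephp;integrated security=true"/>
            <!-- Or SQL authentication only (requires the instance to allow
                 mixed mode); keep exactly one of these two entries. -->
            <add name="cn" connectionString="server=.\sqlexpress;database=cakephp;integrated security=false;uid=sarin;pwd=******"/>
        </connectionStrings>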

  • How can I automatically restore all open PDF files after rebooting in Windows?

    - by Coldblackice
    I've tried using "Cache My Work" (http://cachemywork.codeplex.com/), but unfortunately, it only restores one instance of a program that was open upon rebooting. So when I have five separate Adobe Acrobat Pro windows open (each with its own PDF document), when I reboot, Cache My Work will only reopen one of them (not sure how CMW chooses which PDF to reopen, either). Besides switching to another PDF program (like one with tabs), is there a program that can do this?

  • Perform SilkTest actions on an Eclipse that's always already open?

    - by Tom
    I want to be able to recognize the Eclipse window, which is always already open, so I won't ever need to open it with SilkTest. Is there a way to set the base state to be a window that's always going to be open? It seems that the way to set the base state also designates an executable for it to open; the executable won't necessarily always be in the same location, and it would be a pain to configure that. Is this possible? I've already tried desktop.<Window>find("//Window[@caption='Java EE*']"), which doesn't work.

  • 'sudo apt-get update' error

    - by psilo
    I've been having an issue with 'sudo apt-get update' for several days now. I've tried every proposed solution I could find, but to no avail. Here is the output of 'apt-get update':

    Ign http://us.archive.ubuntu.com precise InRelease Ign http://us.archive.ubuntu.com precise-updates InRelease Ign http://us.archive.ubuntu.com precise-backports InRelease Ign http://us.archive.ubuntu.com precise-security InRelease Ign http://archive.ubuntu.com precise InRelease Err http://us.archive.ubuntu.com precise Release.gpg Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates Release.gpg Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports Release.gpg Unable to connect to 69.163.233.85:80: Err http://archive.ubuntu.com precise Release.gpg Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security Release.gpg Unable to connect to 69.163.233.85:80: Ign http://us.archive.ubuntu.com precise Release Ign http://us.archive.ubuntu.com precise-updates Release Ign http://archive.ubuntu.com precise Release Ign http://us.archive.ubuntu.com precise-backports Release Ign http://us.archive.ubuntu.com precise-security Release Ign http://us.archive.ubuntu.com precise/main TranslationIndex Ign http://us.archive.ubuntu.com precise/multiverse TranslationIndex Ign http://us.archive.ubuntu.com precise/restricted TranslationIndex Ign http://us.archive.ubuntu.com precise/universe TranslationIndex Ign http://archive.ubuntu.com precise/main TranslationIndex Ign http://us.archive.ubuntu.com precise-updates/main TranslationIndex Ign http://us.archive.ubuntu.com precise-updates/multiverse TranslationIndex Ign http://us.archive.ubuntu.com precise-updates/restricted TranslationIndex Ign http://us.archive.ubuntu.com precise-updates/universe TranslationIndex Ign http://us.archive.ubuntu.com precise-backports/main TranslationIndex Ign http://us.archive.ubuntu.com precise-backports/multiverse TranslationIndex Ign http://us.archive.ubuntu.com precise-backports/restricted TranslationIndex Ign http://us.archive.ubuntu.com precise-backports/universe TranslationIndex Ign http://us.archive.ubuntu.com precise-security/main TranslationIndex Ign http://us.archive.ubuntu.com precise-security/multiverse TranslationIndex Ign http://us.archive.ubuntu.com precise-security/restricted TranslationIndex Ign http://us.archive.ubuntu.com precise-security/universe TranslationIndex Err http://archive.ubuntu.com precise/main Sources Unable to connect to 69.163.233.85:80: Err http://archive.ubuntu.com precise/main i386 Packages Unable to connect to 69.163.233.85:80: Err http://archive.ubuntu.com precise/main Translation-en_US Unable to connect to 69.163.233.85:80: Err http://archive.ubuntu.com precise/main Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/main Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/restricted Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/universe Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/multiverse Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/main i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/restricted i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/universe i386 Packages Unable to connect to 69.163.233.85:80: Err
http://us.archive.ubuntu.com precise/multiverse i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/main Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/restricted Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/universe Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/multiverse Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/main i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/restricted i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/universe i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/multiverse i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/main Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/restricted Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/universe Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/multiverse Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/main i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/restricted i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/universe i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/multiverse i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/main Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/restricted Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/universe Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/multiverse Sources Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/main i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/restricted i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/universe i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/multiverse i386 Packages Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/main Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/main Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/multiverse Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/multiverse Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/restricted Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/restricted Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/universe Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise/universe Translation-en Unable to connect to 
69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/main Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/main Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/multiverse Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/multiverse Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/restricted Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/restricted Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/universe Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-updates/universe Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/main Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/main Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/multiverse Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/multiverse Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/restricted Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/restricted Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/universe Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-backports/universe Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/main Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/main Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/multiverse Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/multiverse Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/restricted Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/restricted Translation-en Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/universe Translation-en_US Unable to connect to 69.163.233.85:80: Err http://us.archive.ubuntu.com precise-security/universe Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/Release.gpg Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/Release.gpg Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/Release.gpg Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/Release.gpg Unable to connect to 69.163.233.85:80: W: Failed to fetch http://archive.ubuntu.com/dists/precise/Release.gpg Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/main/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch 
http://us.archive.ubuntu.com/ubuntu/dists/precise/restricted/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/universe/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/multiverse/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/main/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/restricted/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/universe/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/multiverse/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/main/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/restricted/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/universe/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/multiverse/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/main/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/restricted/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/universe/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/multiverse/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://archive.ubuntu.com/dists/precise/main/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://archive.ubuntu.com/dists/precise/main/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/main/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/restricted/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/universe/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/multiverse/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/main/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/restricted/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/universe/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/multiverse/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch 
http://us.archive.ubuntu.com/ubuntu/dists/precise-security/main/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/restricted/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/universe/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/multiverse/source/Sources Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/main/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/restricted/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/universe/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/multiverse/binary-i386/Packages Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/main/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/main/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/multiverse/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/multiverse/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/restricted/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/restricted/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/universe/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise/universe/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://archive.ubuntu.com/dists/precise/main/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://archive.ubuntu.com/dists/precise/main/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/main/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/main/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/multiverse/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/multiverse/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/restricted/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/restricted/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/universe/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch 
http://us.archive.ubuntu.com/ubuntu/dists/precise-updates/universe/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/main/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/main/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/multiverse/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/multiverse/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/restricted/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/restricted/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/universe/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-backports/universe/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/main/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/main/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/multiverse/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/multiverse/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/restricted/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/restricted/i18n/Translation-en Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/universe/i18n/Translation-en_US Unable to connect to 69.163.233.85:80: W: Failed to fetch http://us.archive.ubuntu.com/ubuntu/dists/precise-security/universe/i18n/Translation-en Unable to connect to 69.163.233.85:80: E: Some index files failed to download. They have been ignored, or old ones used instead.
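
    Not part of the original post, but the address apt keeps trying (69.163.233.85) does not look like an Ubuntu archive server, which points at a DNS or proxy problem rather than a mirror outage. A few standard checks, assuming a Precise box with working network access:

        # What does this machine resolve the archive to?
        nslookup us.archive.ubuntu.com

        # Is a proxy forced on apt or the environment?
        grep -ri proxy /etc/apt/ /etc/environment

        # Is there a stale override in /etc/hosts?
        grep archive.ubuntu.com /etc/hosts

        # Can the archive be reached directly over HTTP?
        wget -O /dev/null http://us.archive.ubuntu.com/ubuntu/dists/precise/Release

        # Re-run the update once the lookup returns a sane address.
        sudo apt-get update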

  • iptables won't save the rule

    - by ArchUser
    Hello, I'm using Arch Linux and I have an iptables rule set that I know works (it comes from my other server). It lives in /etc/iptables/iptables.rules and is the only rule set in that directory. I run /etc/rc.d/iptables save, then /etc/rc.d/iptables restart, but when I do "iptables --list" I get ACCEPT policies on INPUT, FORWARD and OUTPUT.

        # Generated by iptables-save v1.4.8 on Sat Jan 8 18:42:50 2011
        *filter
        :INPUT DROP [0:0]
        :FORWARD DROP [0:0]
        :OUTPUT ACCEPT [216:14865]
        :BRUTEGUARD - [0:0]
        :interfaces - [0:0]
        :open - [0:0]
        -A INPUT -p icmp -m icmp --icmp-type 18 -j DROP
        -A INPUT -p icmp -m icmp --icmp-type 17 -j DROP
        -A INPUT -p icmp -m icmp --icmp-type 10 -j DROP
        -A INPUT -p icmp -m icmp --icmp-type 9 -j DROP
        -A INPUT -p icmp -m icmp --icmp-type 5 -j DROP
        -A INPUT -p icmp -j ACCEPT
        -A INPUT -m state --state RELATED,ESTABLISHED -j ACCEPT
        -A INPUT -j interfaces
        -A INPUT -j open
        -A INPUT -p tcp -j REJECT --reject-with tcp-reset
        -A INPUT -p udp -j REJECT --reject-with icmp-port-unreachable
        -A INPUT -p tcp -m tcp ! --tcp-flags FIN,SYN,RST,ACK SYN -m state --state NEW -j DROP
        -A INPUT -f -j DROP
        -A INPUT -p tcp -m tcp --tcp-flags FIN,SYN,RST,PSH,ACK,URG FIN,SYN,RST,PSH,ACK,URG -j DROP
        -A INPUT -p tcp -m tcp --tcp-flags FIN,SYN,RST,PSH,ACK,URG NONE -j DROP
        -A INPUT -i eth+ -p icmp -m icmp --icmp-type 8 -j DROP
        -A BRUTEGUARD -m recent --set --name BF --rsource
        -A BRUTEGUARD -m recent --update --seconds 600 --hitcount 20 --name BF --rsource -j LOG --log-prefix "[BRUTEFORCE ATTEMPT] " --log-level 6
        -A BRUTEGUARD -m recent --update --seconds 600 --hitcount 20 --name BF --rsource -j DROP
        -A interfaces -i lo -j ACCEPT
        -A open -p tcp -m tcp --dport 80 -j ACCEPT
        -A open -p tcp -m tcp --dport 10011 -j ACCEPT
        -A open -p udp -m udp --dport 9987 -j ACCEPT
        -A open -p tcp -m tcp --dport 30033 -j ACCEPT
        -A open -p tcp -m tcp --dport 8000 -j ACCEPT
        -A open -p tcp -m tcp --dport 8001 -j ACCEPT
        -A open -s 76.119.125.61 -p tcp -m tcp --dport 21 -j ACCEPT
        -A open -s 76.119.125.61 -p tcp -m tcp --dport 3306 -j ACCEPT
        -A open -p tcp -m tcp --dport 22 -j BRUTEGUARD
        -A open -s 76.119.125.61 -p tcp -m tcp --dport 22 -m state --state NEW,RELATED,ESTABLISHED -j ACCEPT
        COMMIT
        # Completed on Sat Jan 8 18:42:50 2011
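
    A couple of checks that usually pin this down on Arch (suggested steps, not from the original post):

        # Feed the file to iptables-restore by hand and watch for parse errors.
        iptables-restore < /etc/iptables/iptables.rules

        # If that succeeds, the DROP policies should now be live.
        iptables -L -n -v

        # Make sure the rc script is reading the same file (IPTABLES_CONF
        # in /etc/conf.d/iptables on initscripts-era Arch).
        grep -i iptables /etc/conf.d/iptables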

  • different nmap results

    - by aasasas
    Hello, I ran a scan on my server from outside and from inside; why are the results different?

        [root@xxx ~]# nmap -sV -p 0-65535 localhost
        Starting Nmap 5.51 ( http://nmap.org ) at 2011-02-16 07:59 MSK
        Nmap scan report for localhost (127.0.0.1)
        Host is up (0.000015s latency).
        rDNS record for 127.0.0.1: localhost.localdomain
        Not shown: 65534 closed ports
        PORT   STATE SERVICE VERSION
        22/tcp open  ssh     OpenSSH 4.3 (protocol 2.0)
        80/tcp open  http    Apache httpd 2.2.3 ((CentOS))
        Service detection performed. Please report any incorrect results at http://nmap.org/submit/ .
        Nmap done: 1 IP address (1 host up) scanned in 7.99 seconds

    AND

        sh-3.2# nmap -sV -p 0-65535 xxx.com
        Starting Nmap 5.51 ( http://nmap.org ) at 2011-02-16 00:01 EST
        Warning: Unable to open interface vmnet1 -- skipping it.
        Warning: Unable to open interface vmnet8 -- skipping it.
        Stats: 0:07:49 elapsed; 0 hosts completed (1 up), 1 undergoing SYN Stealth Scan
        SYN Stealth Scan Timing: About 36.92% done; ETC: 00:22 (0:13:21 remaining)
        Stats: 0:22:05 elapsed; 0 hosts completed (1 up), 1 undergoing Service Scan
        Service scan Timing: About 75.00% done; ETC: 00:23 (0:00:02 remaining)
        Nmap scan report for xxx.com (x.x.x.x)
        Host is up (0.22s latency).
        Not shown: 65528 closed ports
        PORT     STATE SERVICE    VERSION
        21/tcp   open  tcpwrapped
        22/tcp   open  ssh        OpenSSH 4.3 (protocol 2.0)
        25/tcp   open  tcpwrapped
        80/tcp   open  http       Apache httpd 2.2.3 ((CentOS))
        110/tcp  open  tcpwrapped
        143/tcp  open  tcpwrapped
        443/tcp  open  tcpwrapped
        8080/tcp open  http-proxy?
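
    A quick check (not in the original post) that separates "service bound to 127.0.0.1 only" from "something in between is answering": run the same scan from the server itself against its public address and compare with the external result:

        # From the server, scan the public IP instead of localhost.
        nmap -sV -p 0-65535 x.x.x.x

    If the extra ports (21, 25, 110, 143, 443, 8080) still only show up from outside, they are being answered by an intermediate device or wrapper rather than by daemons on the host, which is also what the "tcpwrapped" service label hints at.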

  • How could I embed formatted XML source in WORD documents?

    - by eckes
    I'm writing documentation with WORD that contains XML source code (whole files) as examples. The way I'm currently embedding the XML is quite cumbersome and doesn't seem very maintainable to me: I finish editing the document in WORD and create a PDF from it using Acrobat; next, I open my XML files (2x input files, 2x generated output files) with IE and print them with the PDF printer supplied by Acrobat; then I open Acrobat Pro and attach the four XML PDFs to my original document. The problem with that workflow for me is that it involves too much manual labor to get the documentation done. What I've tried so far is not really satisfying: converting each XML to PDF and appending them as described above; opening the XML files with SciTE, copying as RTF and pasting into Word; playing around with the LaTeX packages minted, pygments and listings (I could write the docs with LaTeX too), but I found unsolvable problems in each of these packages. I'm searching for a way to produce my documentation more automatically, for example by embedding the XML files with IE's formatting (which I find quite readable). The files should be included by reference, so that I don't have to paste the XML sources manually every time the XML changes.
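
    Since the listings package already came up: including the files by reference is exactly what \lstinputlisting does, so the XML is re-read on every compile and never pasted by hand. A minimal sketch (the four file names are placeholders):

        \documentclass{article}
        \usepackage{listings}

        \lstset{language=XML, basicstyle=\ttfamily\small,
                breaklines=true, frame=single}

        \begin{document}
        \section{Input files}
        \lstinputlisting{input1.xml}   % pulled in by reference at compile time
        \lstinputlisting{input2.xml}
        \section{Generated output}
        \lstinputlisting{output1.xml}
        \lstinputlisting{output2.xml}
        \end{document}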

  • jQuery slide open and close menus. How to stop them going crazy? [closed]

    - by firefusion
    I want the sub-menus of a vertical menu to expand and collapse when moused over. This is what I've got so far, but it goes crazy if you do it too quickly, because all the animations run at once and on a delay. How can I make sure just one menu expands at a time? I've also set the current_page_item to be open by default, and I don't want that one to expand or collapse.

        <ul>
          <li class="current_page_item"><a href="#">Parent Item</a>
            <ul class="children">
              <li class="page_item"><a href="#">Child page</a></li>
              <li class="page_item"><a href="#">Child page</a></li>
              <li class="page_item"><a href="#">Child page</a></li>
              <li class="page_item"><a href="#">Child page</a></li>
            </ul>
          </li>
          <li class="page_item"><a href="#">Parent Item</a>
            <ul class="children">
              <li class="page_item"><a href="#">Child page</a></li>
              <li class="page_item"><a href="#">Child page</a></li>
            </ul>
          </li>
          <li class="page_item"><a href="#">Parent Item</a>
            <ul class="children">
              <li class="page_item"><a href="#">Child page</a></li>
              <li class="page_item"><a href="#">Child page</a></li>
            </ul>
          </li>
          <li class="page_item"><a href="#">Parent Item</a></li>
          <li class="page_item"><a href="#">Parent Item</a></li>
        </ul>

        jQuery('ul.children').hide();
        jQuery('li.current_page_item ul.children').show();
        jQuery('li.current_page_item').parent().show();

        jQuery("li.page_item").hover(function() {
            jQuery(this).find('ul.children').delay(300).slideDown('slow');
        }, function() {
            jQuery(this).find('ul.children').delay(300).slideUp('slow');
        });
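
    The usual cure for the queue build-up is to call .stop() before starting a new animation, and to leave the always-open item out of the hover binding. A sketch of that variant, using the class names from the markup above:

        // The permanently open menu has no page_item class in the markup,
        // but .not() makes the exclusion explicit anyway.
        jQuery("li.page_item").not(".current_page_item").hover(function () {
            // stop(true, true): clear any queued animations and jump to the
            // end of the running one, so fast mouse moves cannot stack slides.
            jQuery(this).children("ul.children").stop(true, true).slideDown("slow");
        }, function () {
            jQuery(this).children("ul.children").stop(true, true).slideUp("slow");
        });

    Note that .delay() is dropped: delayed animations still join the queue, which is what made the menus "go crazy" in the first place. Using children() instead of find() also keeps a parent's hover from animating grandchild lists.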

  • Problem compiling Hive with Ant

    - by conandor
    I am compiling on Solaris 10 SPARC, with JDK 1.6 from Sun and Ant 1.7.1 from OpenCSW. I have no problem running Hadoop 0.17.2.1. However, I have a problem compiling/integrating Hive: the build fails with 'cannot find symbol' errors, although I followed the tutorial. I have the Hive source code from SVN, exactly as in the tutorial. How can I find out which Hive version I am compiling, and how can I compile against Hadoop 0.17.2.1? Please advise. Thank you.

        -bash-3.00$ export PATH=/usr/jdk/instances/jdk1.6.0/bin:/usr/bin:/opt/csw/bin:/opt/webstack/bin
        -bash-3.00$ export JAVA_HOME=/usr/jdk/instances/jdk1.6.0
        -bash-3.00$ export HADOOP=/export/home/mywork/hadoop-0.17.2.1/bin/hadoop
        -bash-3.00$ /opt/csw/bin/ant package -Dhadoop.version=0.17.2.1

    Buildfile: build.xml jar: create-dirs: compile-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks deploy-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks jar: init: compile: ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-source [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-source [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-source [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 25878ms :: artifacts dl 37ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 4 | 0 | 0 | 0 || 4 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/82ms) install-hadoopcore-internal: build_shims: [echo] Compiling shims against hadoop 0.17.2.1 (/export/home/mywork/hive/build/hadoopcore/hadoop-0.17.2.1) ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-source [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-source [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-source [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 12041ms :: artifacts dl 30ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default
| 4 | 0 | 0 | 0 || 4 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/39ms) install-hadoopcore-internal: build_shims: [echo] Compiling shims against hadoop 0.18.3 (/export/home/mywork/hive/build/hadoopcore/hadoop-0.18.3) ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-source [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-source [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-source [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 11107ms :: artifacts dl 36ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 4 | 0 | 0 | 0 || 4 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/49ms) install-hadoopcore-internal: build_shims: [echo] Compiling shims against hadoop 0.19.0 (/export/home/mywork/hive/build/hadoopcore/hadoop-0.19.0) ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.17.2.1 in hadoop-source [ivy:retrieve] found hadoop#core;0.18.3 in hadoop-source [ivy:retrieve] found hadoop#core;0.19.0 in hadoop-source [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 9969ms :: artifacts dl 33ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 4 | 0 | 0 | 0 || 4 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#shims [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 4 already retrieved (0kB/57ms) install-hadoopcore-internal: build_shims: [echo] Compiling shims against hadoop 0.20.0 (/export/home/mywork/hive/build/hadoopcore/hadoop-0.20.0) jar: [echo] Jar: shims create-dirs: compile-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks 
deploy-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks jar: init: install-hadoopcore: install-hadoopcore-default: ivy-init-dirs: ivy-download: [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar [get] To: /export/home/mywork/hive/build/ivy/lib/ivy-2.1.0.jar [get] Not modified - so not downloaded ivy-probe-antlib: ivy-init-antlib: ivy-init: ivy-retrieve-hadoop-source: [ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ :: [ivy:retrieve] :: loading settings :: file = /export/home/mywork/hive/ivy/ivysettings.xml [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#common;working@kaili [ivy:retrieve] confs: [default] [ivy:retrieve] found hadoop#core;0.20.0 in hadoop-source [ivy:retrieve] :: resolution report :: resolve 4864ms :: artifacts dl 13ms --------------------------------------------------------------------- | | modules || artifacts | | conf | number| search|dwnlded|evicted|| number|dwnlded| --------------------------------------------------------------------- | default | 1 | 0 | 0 | 0 || 1 | 0 | --------------------------------------------------------------------- [ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common [ivy:retrieve] confs: [default] [ivy:retrieve] 0 artifacts copied, 1 already retrieved (0kB/52ms) install-hadoopcore-internal: setup: compile: [echo] Compiling: common jar: [echo] Jar: common create-dirs: compile-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks deploy-ant-tasks: create-dirs: init: compile: [echo] Compiling: anttasks jar: init: dynamic-serde: compile: [echo] Compiling: hive [javac] Compiling 167 source files to /export/home/mywork/hive/build/serde/classes [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/ObjectInspectorFactory.java:30: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorFactory [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/ObjectInspectorFactory.java:31: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorUtils [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/MetadataTypedColumnsetSerDe.java:31: cannot find symbol [javac] symbol : class MetadataListStructObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector [javac] import org.apache.hadoop.hive.serde2.objectinspector.MetadataListStructObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:33: cannot find symbol [javac] symbol : class BooleanObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.BooleanObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:35: cannot find symbol [javac] symbol : class DoubleObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import 
org.apache.hadoop.hive.serde2.objectinspector.primitive.DoubleObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:36: cannot find symbol [javac] symbol : class FloatObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.FloatObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:39: cannot find symbol [javac] symbol : class ShortObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.ShortObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java:40: cannot find symbol [javac] symbol : class StringObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:44: cannot find symbol [javac] symbol : class BooleanObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.BooleanObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:46: cannot find symbol [javac] symbol : class DoubleObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.DoubleObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:47: cannot find symbol [javac] symbol : class FloatObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.FloatObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:50: cannot find symbol [javac] symbol : class ShortObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.ShortObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/binarysortable/BinarySortableSerDe.java:51: cannot find symbol [javac] symbol : class StringObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazySimpleSerDe.java:43: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorFactory [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/columnar/ColumnarSerDe.java:41: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorFactory 
[javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyStruct.java:26: cannot find symbol [javac] symbol : class LazySimpleStructObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.lazy.objectinspector [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.LazySimpleStructObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyStruct.java:39: cannot find symbol [javac] symbol: class LazySimpleStructObjectInspector [javac] LazyNonPrimitive<LazySimpleStructObjectInspector> { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyStruct.java:68: cannot find symbol [javac] symbol : class LazySimpleStructObjectInspector [javac] location: class org.apache.hadoop.hive.serde2.lazy.LazyStruct [javac] public LazyStruct(LazySimpleStructObjectInspector oi) { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDe.java:36: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorFactory [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDe.java:37: cannot find symbol [javac] symbol : class PrimitiveObjectInspectorUtils [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDeTypeString.java:23: cannot find symbol [javac] symbol : class StringObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDeTypei16.java:23: cannot find symbol [javac] symbol : class ShortObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.ShortObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDeTypeDouble.java:23: cannot find symbol [javac] symbol : class DoubleObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.DoubleObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type/DynamicSerDeTypeBool.java:23: cannot find symbol [javac] symbol : class BooleanObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.objectinspector.primitive [javac] import org.apache.hadoop.hive.serde2.objectinspector.primitive.BooleanObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyBoolean.java:20: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does 
not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyBooleanObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyBoolean.java:37: cannot find symbol [javac] symbol: class LazyBooleanObjectInspector [javac] LazyPrimitive<LazyBooleanObjectInspector, BooleanWritable> { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyBoolean.java:39: cannot find symbol [javac] symbol : class LazyBooleanObjectInspector [javac] location: class org.apache.hadoop.hive.serde2.lazy.LazyBoolean [javac] public LazyBoolean(LazyBooleanObjectInspector oi) { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyByte.java:21: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyByteObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyByte.java:37: cannot find symbol [javac] symbol: class LazyByteObjectInspector [javac] LazyPrimitive<LazyByteObjectInspector, ByteWritable> { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyByte.java:39: cannot find symbol [javac] symbol : class LazyByteObjectInspector [javac] location: class org.apache.hadoop.hive.serde2.lazy.LazyByte [javac] public LazyByte(LazyByteObjectInspector oi) { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyDouble.java:23: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyDoubleObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyDouble.java:31: cannot find symbol [javac] symbol: class LazyDoubleObjectInspector [javac] LazyPrimitive<LazyDoubleObjectInspector, DoubleWritable> { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyDouble.java:33: cannot find symbol [javac] symbol : class LazyDoubleObjectInspector [javac] location: class org.apache.hadoop.hive.serde2.lazy.LazyDouble [javac] public LazyDouble(LazyDoubleObjectInspector oi) { [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:25: cannot find symbol [javac] symbol : class LazyObjectInspectorFactory [javac] location: package org.apache.hadoop.hive.serde2.lazy.objectinspector [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.LazyObjectInspectorFactory; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:26: cannot find symbol [javac] symbol : class LazySimpleStructObjectInspector [javac] location: package org.apache.hadoop.hive.serde2.lazy.objectinspector [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.LazySimpleStructObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:27: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyBooleanObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:28: package 
org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyByteObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:29: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyDoubleObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:30: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyFloatObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:31: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyIntObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:32: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyLongObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:33: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyPrimitiveObjectInspectorFactory; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:34: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyShortObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFactory.java:35: package org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive does not exist [javac] import org.apache.hadoop.hive.serde2.lazy.objectinspector.primitive.LazyStringObjectInspector; [javac] ^ [javac] /export/home/mywork/hive/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyFloat.java:
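
    Not from the original post, but two commands that answer the two questions asked above, assuming the checkout lives in /export/home/mywork/hive as the log shows:

        # Which revision/branch of Hive is being built? (the code came from SVN)
        cd /export/home/mywork/hive && svn info

        # Build against a specific Hadoop version, as already attempted above;
        # the value must match a version the shims build file knows about.
        /opt/csw/bin/ant -Dhadoop.version=0.17.2.1 package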

    Read the article

  • SSL / HTTP / No Response to Curl

    - by Alex McHale
    I am trying to send commands to a SOAP service and getting nothing in reply. The SOAP service is at a completely separate site from either server I am testing with. I have written a dummy script with the SOAP XML embedded. When I run it at my local site, on any of three machines -- OSX, Ubuntu, or CentOS 5.3 -- it completes successfully with a good response.

    I then sent the script to our public host at Slicehost, where I fail to get the response back from the SOAP service. It accepts the TCP socket and proceeds with the SSL handshake, but I do not receive any valid HTTP response. This is the case whether I use my script or curl on the command line. I have rewritten the script using SOAP4R, Net::HTTP and Curb, all of which work at my local site and none of which work at the Slicehost site.

    I have tried to assemble the CentOS box to match my Slicehost server as closely as possible. I rebuilt the slice as a stock CentOS 5.3 and a stock CentOS 5.4, with the same results. When I look at a tcpdump of the bad sessions on Slicehost, I see my script or curl send the XML to the remote server, and nothing comes back. When I look at the tcpdump at my local site, I see the response just fine. I have entirely disabled iptables on the slice.

    Does anyone have any ideas what could be causing these results? Please let me know what additional information I can furnish. Thank you!

    Below is a wire trace of a sample session. The IP that starts with 173 is my server, while the IP that starts with 12 is the SOAP server's.

    No. Time Source Destination Protocol Info
    1 0.000000 173.45.x.x 12.36.x.x TCP 36872 > https [SYN] Seq=0 Win=5840 Len=0 MSS=1460 TSV=137633469 TSER=0 WS=6
    Frame 1 (74 bytes on wire, 74 bytes captured)
    Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1)
    Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x)
    Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 0, Len: 0

    No. Time Source Destination Protocol Info
    2 0.040000 12.36.x.x 173.45.x.x TCP https > 36872 [SYN, ACK] Seq=0 Ack=1 Win=8760 Len=0 MSS=1460
    Frame 2 (62 bytes on wire, 62 bytes captured)
    Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6)
    Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x)
    Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 0, Ack: 1, Len: 0

    No. Time Source Destination Protocol Info
    3 0.040000 173.45.x.x 12.36.x.x TCP 36872 > https [ACK] Seq=1 Ack=1 Win=5840 Len=0
    Frame 3 (54 bytes on wire, 54 bytes captured)
    Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1)
    Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x)
    Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 1, Ack: 1, Len: 0

    No. Time Source Destination Protocol Info
    4 0.050000 173.45.x.x 12.36.x.x SSLv2 Client Hello
    Frame 4 (156 bytes on wire, 156 bytes captured)
    Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1)
    Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x)
    Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 1, Ack: 1, Len: 102
    Secure Socket Layer

    No. Time Source Destination Protocol Info
    5 0.130000 12.36.x.x 173.45.x.x TCP [TCP segment of a reassembled PDU]
    Frame 5 (1434 bytes on wire, 1434 bytes captured)
    Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6)
    Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x)
    Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 1, Ack: 103, Len: 1380
    Secure Socket Layer

    No. Time Source Destination Protocol Info
    6 0.130000 173.45.x.x 12.36.x.x TCP 36872 > https [ACK] Seq=103 Ack=1381 Win=8280 Len=0
    Frame 6 (54 bytes on wire, 54 bytes captured)
    Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1)
    Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x)
    Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 103, Ack: 1381, Len: 0

    No. Time Source Destination Protocol Info
    7 0.130000 12.36.x.x 173.45.x.x TLSv1 Server Hello, Certificate, Server Hello Done
    Frame 7 (1280 bytes on wire, 1280 bytes captured)
    Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6)
    Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x)
    Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 1381, Ack: 103, Len: 1226
    [Reassembled TCP Segments (2606 bytes): #5(1380), #7(1226)]
    Secure Socket Layer

    No. Time Source Destination Protocol Info
    8 0.130000 173.45.x.x 12.36.x.x TCP 36872 > https [ACK] Seq=103 Ack=2607 Win=11040 Len=0
    Frame 8 (54 bytes on wire, 54 bytes captured)
    Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1)
    Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x)
    Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 103, Ack: 2607, Len: 0

    No. Time Source Destination Protocol Info
    9 0.130000 173.45.x.x 12.36.x.x TLSv1 Client Key Exchange, Change Cipher Spec, Encrypted Handshake Message
    Frame 9 (236 bytes on wire, 236 bytes captured)
    Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1)
    Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x)
    Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 103, Ack: 2607, Len: 182
    Secure Socket Layer

    No. Time Source Destination Protocol Info
    10 0.190000 12.36.x.x 173.45.x.x TLSv1 Change Cipher Spec, Encrypted Handshake Message
    Frame 10 (97 bytes on wire, 97 bytes captured)
    Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6)
    Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x)
    Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 2607, Ack: 285, Len: 43
    Secure Socket Layer

    No. Time Source Destination Protocol Info
    11 0.190000 173.45.x.x 12.36.x.x TLSv1 Application Data
    Frame 11 (347 bytes on wire, 347 bytes captured)
    Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1)
    Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x)
    Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 285, Ack: 2650, Len: 293
    Secure Socket Layer

    No. Time Source Destination Protocol Info
    12 0.190000 173.45.x.x 12.36.x.x TCP [TCP segment of a reassembled PDU]
    Frame 12 (1514 bytes on wire, 1514 bytes captured)
    Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1)
    Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x)
    Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460
    Secure Socket Layer

    No. Time Source Destination Protocol Info
    13 0.450000 12.36.x.x 173.45.x.x TCP https > 36872 [ACK] Seq=2650 Ack=578 Win=64958 Len=0
    Frame 13 (54 bytes on wire, 54 bytes captured)
    Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6)
    Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x)
    Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 2650, Ack: 578, Len: 0

    No. Time Source Destination Protocol Info
    14 0.450000 173.45.x.x 12.36.x.x TCP [TCP segment of a reassembled PDU]
    Frame 14 (206 bytes on wire, 206 bytes captured)
    Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1)
    Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x)
    Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 2038, Ack: 2650, Len: 152

    No. Time Source Destination Protocol Info
    15 0.510000 12.36.x.x 173.45.x.x TCP [TCP Dup ACK 13#1] https > 36872 [ACK] Seq=2650 Ack=578 Win=64958 Len=0
    Frame 15 (54 bytes on wire, 54 bytes captured)
    Ethernet II, Src: Dell_fb:49:a1 (00:21:9b:fb:49:a1), Dst: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6)
    Internet Protocol, Src: 12.36.x.x (12.36.x.x), Dst: 173.45.x.x (173.45.x.x)
    Transmission Control Protocol, Src Port: https (443), Dst Port: 36872 (36872), Seq: 2650, Ack: 578, Len: 0

    No. Time Source Destination Protocol Info
    16 0.850000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU]
    Frame 16 (1514 bytes on wire, 1514 bytes captured)
    Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1)
    Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x)
    Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460
    Secure Socket Layer

    No. Time Source Destination Protocol Info
    17 1.650000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU]
    Frame 17 (1514 bytes on wire, 1514 bytes captured)
    Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1)
    Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x)
    Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460
    Secure Socket Layer

    No. Time Source Destination Protocol Info
    18 3.250000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU]
    Frame 18 (1514 bytes on wire, 1514 bytes captured)
    Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1)
    Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x)
    Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460
    Secure Socket Layer

    No. Time Source Destination Protocol Info
    19 6.450000 173.45.x.x 12.36.x.x TCP [TCP Retransmission] [TCP segment of a reassembled PDU]
    Frame 19 (1514 bytes on wire, 1514 bytes captured)
    Ethernet II, Src: 40:40:17:3a:f4:e6 (40:40:17:3a:f4:e6), Dst: Dell_fb:49:a1 (00:21:9b:fb:49:a1)
    Internet Protocol, Src: 173.45.x.x (173.45.x.x), Dst: 12.36.x.x (12.36.x.x)
    Transmission Control Protocol, Src Port: 36872 (36872), Dst Port: https (443), Seq: 578, Ack: 2650, Len: 1460
    Secure Socket Layer
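    One way to take the Ruby stack out of the picture entirely is a bare-bones standalone client in another language. The sketch below (Java, as one more independent stack; the URL and envelope are placeholders, not the real service) posts a SOAP envelope over HTTPS with explicit timeouts, so a hang shows up as an exception instead of a stuck process:

        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.URL;
        import javax.net.ssl.HttpsURLConnection;

        public class SoapProbe {
            public static void main(String[] args) throws Exception {
                // Placeholder endpoint and envelope -- substitute the real ones.
                URL url = new URL("https://soap.example.com/service");
                byte[] envelope = "<soap:Envelope>...</soap:Envelope>".getBytes("UTF-8");

                HttpsURLConnection conn = (HttpsURLConnection) url.openConnection();
                conn.setConnectTimeout(10000); // fail fast instead of hanging forever
                conn.setReadTimeout(10000);
                conn.setDoOutput(true);
                conn.setRequestMethod("POST");
                conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
                conn.setFixedLengthStreamingMode(envelope.length);

                OutputStream out = conn.getOutputStream();
                out.write(envelope);
                out.close();

                System.out.println("HTTP status: " + conn.getResponseCode());
                InputStream in = conn.getInputStream(); // throws on 4xx/5xx, fine for a probe
                byte[] buf = new byte[4096];
                int n;
                while ((n = in.read(buf)) != -1) {
                    System.out.write(buf, 0, n);
                }
                in.close();
            }
        }

    For what it's worth, the trace itself looks consistent with a path-MTU problem rather than a client bug: the full-size 1460-byte segments (frames 12 and 16-19) are retransmitted and never ACKed, while the smaller segments get through. Clamping the MSS or lowering the MTU on the slice would be a cheap experiment.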

    Read the article

  • java.lang.Error: "Not enough storage is available to process this command" when generating images

    - by jhericks
    I am running a web application on BEA Weblogic 9.2. Until recently, we were using JDK 1.5.0_04, with JAI 1.1.2_01 and Image IO 1.1. In some circumstances (we never figured out exactly why), when we were processing large images (but not that large -- a few MB), the JVM would crash without any error message, stack trace or anything. This didn't happen much in production, but enough to be a nuisance, and eventually we were able to reproduce it. We decided to switch to JRockit90 1.5.0_04, and we were no longer able to reproduce the problem in our test environment, so we thought we had it licked.

    Now, however, after the application server has been up for a while, we start getting the error message "Not enough storage is available to process this command" during image operations. For example:

    java.lang.Error: Error starting thread: Not enough storage is available to process this command.
    at java.lang.Thread.start()V(Unknown Source)
    at sun.awt.image.ImageFetcher$1.run(ImageFetcher.java:279)
    at sun.awt.image.ImageFetcher.createFetchers(ImageFetcher.java:272)
    at sun.awt.image.ImageFetcher.add(ImageFetcher.java:55)
    at sun.awt.image.InputStreamImageSource.startProduction(InputStreamImageSource.java:149)
    at sun.awt.image.InputStreamImageSource.addConsumer(InputStreamImageSource.java:106)
    at sun.awt.image.InputStreamImageSource.startProduction(InputStreamImageSource.java:144)
    at sun.awt.image.ImageRepresentation.startProduction(ImageRepresentation.java:647)
    at sun.awt.image.ImageRepresentation.prepare(ImageRepresentation.java:684)
    at sun.awt.SunToolkit.prepareImage(SunToolkit.java:734)
    at java.awt.Component.prepareImage(Component.java:3073)
    at java.awt.ImageMediaEntry.startLoad(MediaTracker.java:906)
    at java.awt.MediaEntry.getStatus(MediaTracker.java:851)
    at java.awt.ImageMediaEntry.getStatus(MediaTracker.java:902)
    at java.awt.MediaTracker.statusAll(MediaTracker.java:454)
    at java.awt.MediaTracker.waitForAll(MediaTracker.java:405)
    at java.awt.MediaTracker.waitForAll(MediaTracker.java:375)
    at SfxNET.System.Drawing.ImageLoader.loadImage(Ljava.awt.Image;)Ljava.awt.image.BufferedImage;(Unknown Source)
    at SfxNET.System.Drawing.ImageLoader.loadImage(Ljava.net.URL;)Ljava.awt.image.BufferedImage;(Unknown Source)
    at Resources.Tools.Commands.W$zw(Ljava.lang.ClassLoader;)V(Unknown Source)
    at Resources.Tools.Commands.getContents()[[Ljava.lang.Object;(Unknown Source)
    at SfxNET.sfxUtils.SfxResourceBundle.handleGetObject(Ljava.lang.String;)Ljava.lang.Object;(Unknown Source)
    at java.util.ResourceBundle.getObject(ResourceBundle.java:320)
    at SoftwareFX.internal.ChartFX.wxvw.yxWW(Ljava.lang.String;Z)Ljava.lang.Object;(Unknown Source)
    at SoftwareFX.internal.ChartFX.wxvw.vxWW(Ljava.lang.String;)Ljava.lang.Object;(Unknown Source)
    at SoftwareFX.internal.ChartFX.CommandBar.YWww(LSoftwareFX.internal.ChartFX.wxvw;IIII)V(Unknown Source)
    at SoftwareFX.internal.ChartFX.Internet.Server.xxvw.YzzW(LSoftwareFX.internal.ChartFX.Internet.Server.ChartCore;Z)LSoftwareFX.internal.ChartFX.CommandBar;(Unknown Source)
    at SoftwareFX.internal.ChartFX.Internet.Server.xxvw.XzzW(LSoftwareFX.internal.ChartFX.Internet.Server.ChartCore;)V(Unknown Source)
    at SoftwareFX.internal.ChartFX.Internet.Server.ChartCore.OnDeserialization(Ljava.lang.Object;)V(Unknown Source)
    at SoftwareFX.internal.ChartFX.Internet.Server.ChartCore.Zvvz(LSoftwareFX.internal.ChartFX.Base.wzzy;)V(Unknown Source)

    Has anyone seen something like this before? Any clue what might be happening?
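    The Error is thrown by Thread.start(), and that Windows message usually means the process ran out of native address space or handles while allocating the new thread's stack. A small watcher like the sketch below (class name mine, not from the post) can show whether the thread count climbs steadily before the error appears, which would point at leaking AWT image-fetcher threads:

        import java.lang.management.ManagementFactory;
        import java.lang.management.ThreadMXBean;

        // Hypothetical diagnostic: log JVM thread counts once a minute so a
        // leak of image-fetcher (or other) threads is visible before the
        // "Not enough storage" Error is thrown.
        public class ThreadWatcher implements Runnable {
            public void run() {
                ThreadMXBean threads = ManagementFactory.getThreadMXBean();
                while (!Thread.currentThread().isInterrupted()) {
                    System.out.println("live=" + threads.getThreadCount()
                            + " peak=" + threads.getPeakThreadCount()
                            + " started=" + threads.getTotalStartedThreadCount());
                    try {
                        Thread.sleep(60000L);
                    } catch (InterruptedException e) {
                        return; // stop quietly when the app shuts down
                    }
                }
            }
        }

    If the counts stay flat, the other usual suspect is native memory pressure: a large -Xmx leaves less address space for thread stacks, so reducing the heap or the -Xss stack size sometimes makes this particular message go away.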

    Read the article

  • MalformedByteSequenceException while trying to parse XML

    - by poeschlorn
    Hey guys, maybe someone can help: I have the following .gpx data from Wikipedia:

    <?xml version="1.0" encoding="UTF-8" standalone="no" ?>
    <gpx xmlns="http://www.topografix.com/GPX/1/1" creator="byHand" version="1.1"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://www.topografix.com/GPX/1/1 http://www.topografix.com/GPX/1/1/gpx.xsd">
      <wpt lat="39.921055008" lon="3.054223107">
        <ele>12.863281</ele>
        <time>2005-05-16T11:49:06Z</time>
        <name>Cala Sant Vicenç - Mallorca</name>
        <sym>City</sym>
      </wpt>
    </gpx>

    When I call my parsing method, I get an exception (see below). The call looks like this:

    Document tmpDoc = getParsedXML(currentGPX);

    My parsing method looks like this (standard parsing code, nothing exciting...):

    public static Document getParsedXML(String fileWithPath) {
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        DocumentBuilder db;
        Document doc = null;
        try {
            db = dbf.newDocumentBuilder();
            doc = db.parse(new File(fileWithPath));
        } catch (ParserConfigurationException e) {
            e.printStackTrace();
        } catch (SAXException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return doc;
    }

    This simple code throws the following exception:

    com.sun.org.apache.xerces.internal.impl.io.MalformedByteSequenceException: Invalid byte 2 of 3-byte UTF-8 sequence.
    at com.sun.org.apache.xerces.internal.impl.io.UTF8Reader.invalidByte(Unknown Source)
    at com.sun.org.apache.xerces.internal.impl.io.UTF8Reader.read(Unknown Source)
    at com.sun.org.apache.xerces.internal.impl.XMLEntityScanner.load(Unknown Source)
    at com.sun.org.apache.xerces.internal.impl.XMLEntityScanner.skipChar(Unknown Source)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(Unknown Source)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(Unknown Source)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
    at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(Unknown Source)
    at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(Unknown Source)
    at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(Unknown Source)
    at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(Unknown Source)
    at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(Unknown Source)
    at javax.xml.parsers.DocumentBuilder.parse(Unknown Source)
    at Zeugs.getParsedXML(Zeugs.java:38)
    at Zeugs.main(Zeugs.java:25)

    I guess the error lies within the encoding of the file, but I don't know where exactly. Can you please give me a hint?
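    The message means the parser hit bytes that are not valid UTF-8 even though the declaration promises UTF-8 -- the "ç" in "Vicenç" is a likely culprit if the file was actually saved as ISO-8859-1. One hedged fix (the encoding named here is a guess; check how the file was really saved) is to wrap the file in a Reader with the true encoding, since a parser fed a character stream ignores the declared byte encoding:

        import java.io.FileInputStream;
        import java.io.InputStreamReader;
        import javax.xml.parsers.DocumentBuilder;
        import javax.xml.parsers.DocumentBuilderFactory;
        import org.w3c.dom.Document;
        import org.xml.sax.InputSource;

        public class EncodingAwareParse {
            // Parse XML whose bytes are not in the encoding its declaration
            // claims. "ISO-8859-1" is an assumption -- substitute the file's
            // real encoding once you know it.
            public static Document parse(String fileWithPath) throws Exception {
                DocumentBuilder db =
                        DocumentBuilderFactory.newInstance().newDocumentBuilder();
                InputStreamReader reader = new InputStreamReader(
                        new FileInputStream(fileWithPath), "ISO-8859-1");
                return db.parse(new InputSource(reader));
            }
        }

    Re-saving the file as genuine UTF-8 in an editor is the cleaner long-term fix, since the declaration then matches the bytes.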

    Read the article

  • Source code versioning with comments (organizational practice) - leave or remove?

    - by ADTC
    Before you start admonishing me with "DON'T DO IT," "BAD PRACTICE!" and "Learn to use proper source code control," please hear me out first. I am fully aware that the practice of commenting out old code and leaving it there forever is very bad, and I hate such practice myself. But here's the situation I'm in.

    A few months ago I joined a company as a software developer. I had worked in the company for a few months as an intern, about a year before joining recently. Our company uses source code version control (CVS), but not properly. Here's what happened both in my internship and in my current permanent position. Each time, I was assigned to work on a project (legacy, about 8-10 years old). Instead of creating a CVS account and letting me check out code and check in changes, a senior colleague exported the code from CVS, zipped it up and passed it to me. While this colleague checks in all changes in bulk every few weeks, our usual practice is to do fine-grained versioning in the actual source code itself (each file increments in versions independently from the rest). Whenever a change is made to a file, the old code is commented out, the new code entered below it, and the whole section is marked with a version number. Finally, a note about the changes is placed at the top of the file in a section called Modification History, and the changed files are placed in a shared folder, ready and waiting for the bulk check-in.

    /*
     * Copyright notice blah blah
     * Some details about file (project name, file name etc)
     * Modification History:
     * Date        Version  Modified By  Description
     * 2012-10-15  1.0      Joey         Initial creation
     * 2012-10-22  1.1      Chandler     Replaced old code with new code
     */
    code ....
    //v1.1 start
    //old code
    new code
    //v1.1 end
    code ....

    Now the problem is this. In the project I'm working on, I needed to copy some new source code files from another project (new in the sense that they didn't exist in the destination project before). These files have a lot of historical commented-out code and comment-based versioning, including usually long or very long Modification History sections. Since the files are new to this project, I decided to clean them up and remove unnecessary code, including historical code, and start fresh at version 1.0. (I still have to continue the practice of comment-based versioning despite hating it. And don't ask why not start at version 0.1...)

    I have done something similar during my internship and no one said anything. My supervisor has seen the work a few times and didn't say I shouldn't do such clean-up (if it was noticed at all). But a same-level colleague saw this and said it's not recommended, as it may cause downtime in the future and increase maintenance costs. An example is when changes are made in another project on the original files and these changes need to be propagated to this project. With the code files drastically different, it could cause confusion to an employee doing the propagation. It makes sense to me, and it is a valid point. I couldn't find any reason to do my clean-up other than the inconvenience of ridiculously messy code.

    So, long story short: given the practice in our company, should I not do such clean-up when copying new files from project to project? Is it better to make changes on the (copy of the) original code with full history in comments? Or what justification can I give for doing the clean-up?

    PS to mods: Hope you allow this question some time even if for any reason you determine it to be unfit for SO. I apologize in advance if anything is inappropriate, including tags.

    Read the article

  • How do I make a firefox extension execute script on page open/load? [migrated]

    - by Will Mc
    Thanks in advance! I am creating my first extension (a Firefox extension). See below for a full description of the final product. I need help starting off. I have looked at and studied the HelloWorld.xpi example found on Mozilla's site, so I am happy to edit that to learn. In the example, when you click a menu item it runs script to display an alert message. My question is: how would I edit this extension to run the script on page load? I am guessing I need to insert some code in the browser overlay, as it loads on page load. So, here is the browserOverlay.xul from the example I am editing to learn:

    <?xml version="1.0"?>
    <?xml-stylesheet type="text/css" href="chrome://global/skin/" ?>
    <?xml-stylesheet type="text/css" href="chrome://xulschoolhello/skin/browserOverlay.css" ?>
    <!DOCTYPE overlay SYSTEM "chrome://xulschoolhello/locale/browserOverlay.dtd">
    <overlay id="xulschoolhello-browser-overlay"
             xmlns="http://www.mozilla.org/keymaster/gatekeeper/there.is.only.xul">
      <script type="application/x-javascript"
              src="chrome://xulschoolhello/content/browserOverlay.js" />
      <stringbundleset id="stringbundleset">
        <stringbundle id="xulschoolhello-string-bundle"
                      src="chrome://xulschoolhello/locale/browserOverlay.properties" />
      </stringbundleset>
      <menubar id="main-menubar">
        <menu id="xulschoolhello-hello-menu" label="&xulschoolhello.hello.label;"
              accesskey="&xulschoolhello.helloMenu.accesskey;" insertafter="helpMenu">
          <menupopup>
            <menuitem id="xulschoolhello-hello-menu-item" label="&xulschoolhello.hello.label;"
                      accesskey="&xulschoolhello.helloItem.accesskey;"
                      oncommand="XULSchoolChrome.BrowserOverlay.sayHello(event);" />
          </menupopup>
        </menu>
      </menubar>
      <vbox id="appmenuSecondaryPane">
        <menu id="xulschoolhello-hello-menu-2" label="&xulschoolhello.hello.label;"
              accesskey="&xulschoolhello.helloMenu.accesskey;" insertafter="appmenu_addons">
          <menupopup>
            <menuitem id="xulschoolhello-hello-menu-item-2" label="&xulschoolhello.hello.label;"
                      accesskey="&xulschoolhello.helloItem.accesskey;"
                      oncommand="XULSchoolChrome.BrowserOverlay.sayHello(event);" />
          </menupopup>
        </menu>
      </vbox>
    </overlay>

    I hope you can help me. I need to know what code I should use and where I should put it...

    Here is the gist of my overall extension: I am creating a click-to-call extension. This extension will search any new page for phone numbers, whether the page was just refreshed, newly opened, opened in a new tab, etc. Each phone number, when clicked, will open a new tab and direct the user to a custom URL. Thanks again!

    Read the article

  • is there such a thing as a randomly accessible pseudo-random number generator? (preferably open-source)

    - by lucid
    First off, is there such a thing as a random-access random number generator, where you could not only sequentially generate random numbers as we're all used to (assuming rand100() always generates a value from 0-100):

    for (int i=0;i<5;i++) print rand100()

    output: 14 75 36 22 67

    but also randomly access any random value, like:

    rand100(0) would output 14 as long as you didn't change the seed
    rand100(3) would always output 22
    rand100(4) would always output 67

    and so on... I've actually found an open-source generator algorithm that does this, but you cannot change the seed. I know that pseudorandomness is a complex field; I wouldn't know how to alter it to add that functionality. Is there a seedable random-access random number generator, preferably open source? Or is there a better term for this I can google for more information?

    If not, part 2 of my question would be: is there any reliably random, open-source, conventional seedable pseudorandom number generator that I could port to multiple platforms/languages while retaining a consistent sequence of values for each platform for any given seed?
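    What the post describes is usually called a counter-based (or random-access) generator: make the i-th output a pure function of the seed and the index, and sequential and random access come from the same code. A minimal sketch using the well-known SplitMix64 mixing function (one common construction, not the only one; rand100 maps to 0-100 as in the question):

        public final class RandomAccessRand {
            // SplitMix64 finalizer: a stateless mix of the bits of z.
            private static long mix(long z) {
                z = (z ^ (z >>> 30)) * 0xBF58476D1CE4E5B9L;
                z = (z ^ (z >>> 27)) * 0x94D049BB133111EBL;
                return z ^ (z >>> 31);
            }

            // The i-th value is a pure function of (seed, i): no state, so any
            // index can be read in any order and always yields the same number.
            public static int rand100(long seed, long index) {
                long x = mix(seed + index * 0x9E3779B97F4A7C15L);
                return (int) ((x >>> 1) % 101L); // non-negative, 0..100 inclusive
            }

            public static void main(String[] args) {
                for (int i = 0; i < 5; i++) {
                    System.out.print(rand100(42L, i) + " "); // sequential use
                }
                System.out.println();
                System.out.println(rand100(42L, 3)); // random access: same 4th value
            }
        }

    Because the whole thing is plain 64-bit integer arithmetic, it also answers part 2: any port that uses wrapping/unsigned 64-bit operations produces an identical sequence for a given seed on every platform.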

    Read the article

  • How do I prove I should put a table of values in source code instead of a database table?

    - by FastAl
    <tldr>looking for a reference to a book or other undeniably authoritative source that gives reasons when you should choose a database vs. when you should choose other storage methods. I have provided an un-authoritative list of reasons about 2/3 of the way down this post.</tldr>

    I have a situation at my company where a database is being used where it would be better to use another solution (in this case, an auto-generated piece of source code that contains a static lookup table, searched by binary search). Normally, a database would be an OK solution even though the problem does not require one: none of the elements of ACID are needed, the data is read-only, it is updated about every 3-5 years (also requiring other source-code changes), it fits in memory, and it can be keyed into via binary search (a tad faster than a db, but speed is not an issue).

    The problem is that this code runs on our enterprise server, but is shared with several PC platforms (some disconnected, some using a central DB, etc.), and parts of it are managed by multiple programming units, parts by the DBAs, parts even by mathematicians in another department, etc. These hit their own platform's version of their databases (containing their own copy of the static data). What happens is that with every implementation, every little change, something different goes wrong. There are many other issues as well. I can't even use a flat file, because one mode of running on our enterprise server does not have permission to read files (only databases, and of course its own literal storage, e.g., an in-source table).

    Of course, other parts of the system use databases in proper, less obscure manners; there is no problem with those parts.

    So why don't we just change it? I don't have the administrative ability to force a change. But I'm affected, because sometimes I have to help fix the problems, but mostly because it causes outages and tons of extra IT time by other programmers, and d*mmit that makes me mad!

    The reason neither management nor the designers of the system can see the problem is that they propose a solution that won't work: increase communication, implement more safeguards and standards, etc. But every time, in a different part of the already-pared-down but still multi-step process, a few different diligent, hard-working, top-performing IT personnel make a unique subtle error that causes it to fail, sometimes after the last round of testing! And in general these are not single-person failures, but understandable miscommunications. And communication at our company is actually better than most; people just don't think that's the case because they haven't dug into the matter.

    However, I have it on very good word from somebody with extensive formal study of sociology and psychology that the relatively small amount of less-than-proper database usage in this gigantic cross-platform, multi-source, multi-language project is bureaucratically un-maintainable. Impossible. No chance. At least with human beings in the loop, and it can't be automated.

    In addition, the management and developers who could change this, though intelligent and capable, don't understand the rigidity of this 'how humans are' issue, and are not convincible on the matter.

    The reason putting the static data in source code will solve the problem is that, although the solution is less sexy than a database, it would function with no technical drawbacks; and since the sharing of source code already works very well, you basically erase any database-related effort from this section of the project, along with all the drawbacks that are causing problems.

    OK, that's the background, for the curious. I won't be able to convince management that this is an unfixable sociological problem whose real solution is coding around these limits of human nature, just as you would code around a bug in a 3rd-party component that you can't change. So what I have to do is exploit the unsuitableness of the database solution, and make the case not with logic, but with authority.

    I am aware of many reasons, and of posts on this site giving reasons, for one over the other; I'm not looking for lists of reasons like these (although you can add a comment if I've missed a doozy):

    WHY USE A DATABASE (instead of a flat file or other storage)? If you need:
    - Random read / transparent search optimization
    - Advanced / varied / customizable searching and sorting capabilities
    - Transaction / rollback
    - Locks, semaphores
    - Concurrency control / shared users
    - Security
    - 1-many / m-m is easier
    - Easy modification
    - Scalability
    - Load balancing
    - Random updates / inserts / deletes
    - Advanced query
    - Administrative control of design, etc.
    - SQL / learning curve
    - Debugging / logging
    - Centralized / live backup capabilities
    - Cached queries / develop & cache execution plans
    - Interleaved update/read
    - Referential integrity; avoiding redundant/missing/corrupt/out-of-sync data
    - Reporting (from an OLAP or OLTP db) / turnkey generation tools

    [Disadvantages:]
    - Important to get right the first time - professional design - but only because it's meant to last
    - s/w & h/w cost
    - Usually over a network, a speed issue (best vs. best design vs. local = even then a separate process requires marshalling / network layers / inter-process comm)
    - Indices and query processing can stand in the way of simple processing (vs. a flat file)

    WHY USE A FLAT FILE? If you only need:
    - Sequential row processing only
    - Limited usage, append only (no reading, no master key/update)
    - Only updating the record you're reading (fixed-length records only)
    - Too big to fit into memory
    - Local disk / read-ahead network connection
    - Portability / small system
    - Email / cut & paste / store as document by a novice - simple format
    - Low design learning curve but high cost later

    WHY USE AN IN-MEMORY TABLE (tables, arrays, etc.)? If you need:
    - Processing a single db/ff record that was imported
    - Known size of data
    - Static data, if hardcoding the table
    - Narrow, unchanging use (e.g., one program or proc) - includes a class that will be shared, but encapsulates its data manipulation
    - Extreme speed needed / high transaction frequency
    - Random access - but search is dependent on implementation

    Following are some other posts about the topic:
    http://stackoverflow.com/questions/1499239/database-vs-flat-text-file-what-are-some-technical-reasons-for-choosing-one-over
    http://stackoverflow.com/questions/332825/are-flat-file-databases-any-good
    http://stackoverflow.com/questions/2356851/database-vs-flat-files
    http://stackoverflow.com/questions/514455/databases-vs-plain-text/514530

    What I'd like to know is if anybody could recommend a hard, authoritative source containing these reasons. I'm looking for a paper book I can buy, or a reputable website with whitepapers about the issue (e.g., Microsoft, IBM), not counting the user-generated content on those sites.
    This will have a greater chance of eliciting the change I'm looking for: less wasted programmer time and more reliable programs. Thanks very much for your help. You win a prize for reading such a large post!
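    For the curious, the in-source alternative being argued for is tiny. A sketch of the kind of auto-generated file meant here (names and values hypothetical): a sorted static table plus Arrays.binarySearch, regenerated rather than hand-edited when the data changes every few years:

        import java.util.Arrays;

        // Hypothetical auto-generated lookup table. KEYS is kept sorted
        // ascending so Arrays.binarySearch works; the generator, not a human,
        // rewrites this file whenever the reference data changes.
        public final class RateTable {
            private static final int[] KEYS      = { 1001, 1004, 1009, 1015 };
            private static final double[] VALUES = { 0.035, 0.042, 0.050, 0.061 };

            /** Returns the value for key, or throws if the key is unknown. */
            public static double lookup(int key) {
                int i = Arrays.binarySearch(KEYS, key);
                if (i < 0) {
                    throw new IllegalArgumentException("unknown key: " + key);
                }
                return VALUES[i];
            }

            private RateTable() {} // static table only, never instantiated
        }

    The point of generating the file is that the table then rides along with the rest of the shared source, so every platform gets the same data with no database round trip and no per-platform copy to fall out of sync.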

    Read the article

  • CodePlex Daily Summary for Friday, May 14, 2010

    CodePlex Daily Summary for Friday, May 14, 2010

    New Projects
    - Campfire#: Campfire# is a campfire client written in .NET 4.0 using WPF, which uses the Campfire API.
    - CHESS: Systematic Concurrency Testing: CHESS is a tool for systematic and disciplined concurrency testing. Given a concurrent test, CHESS systematically enumerates the possible thread sc...
    - cmpp: cmpp
    - cycloid: Arcanoid game
    - DotNetNuke® C#: The DotNetNuke® project is developed and maintained on a Visual Basic codebase, however a C# version has always been a popular request. This is a ...
    - EasyBuildingCMS.NET: EasyBuildingCMS is an easy-to-use content management system.
    - fluidCMS: Provides flexible management of web content that is not tightly integrated with the layout and rendering of sites that consume the content.
    - Golem: An automation tool oriented to localization engineering environments
    - HB Batch Encoder Mk 2: HandBrake Batch Encoder Mk II. This program was adapted from an original project downloaded from CodePlex by the name of "Handbrake Batch Encoder"...
    - Integrating Social Media Networks: This is part of my post-graduation project.
    - Ketonic: The Ketonic project aims to improve development of websites based on the Kentico CMS.
    - LinkSharp: LinkSharp is a short-URL provider that can be used to generate short, static, non-changing URLs. The web interface allows you to easily add / edit /...
    - PUC NET (C++ Network Library - PUC Minas): An academic library for easy development of applications and games based on network communication.
    - Regular Expression Tester: Small utility for testing regular expressions
    - SharePoint User Management WebPart: SharePoint User Management WebPart
    - SharpBox: SharpBox makes it easier for .NET developers to interact with existing cloud storage services, e.g. DropBox or Amazon S3
    - Snipivit: Snipivit is a snippet manager service and VS2010 plugin that allows small development teams to store all their code snippets on a central database,...
    - Software Factories Applied: Software Factories Applied is a project collecting the companion bits for the eponymous book to be published by Wiley & Sons in 2011. The authors ...
    - The Ping Master: A service that periodically pings network addresses and allows the running of command-line-type utilities in response to success or failure.
    - Title Safe Region Checker: A simple utility for XNA developers to check screenshots from games intended for release on the LIVE Marketplace for "title safe" region compliance...
    - Trial project: sky is blue
    - Uyghur Named Date: Generate Uyghur named date strings. ئۇيغۇرچە ئاي ناملىق چىسلا ھاسىل قىلىش
    - Wildcard Search Web Part for SharePoint 2010: The Wildcard Search web part for MOSS 2007 was wildly successful. Although SharePoint 2010 has built-in wildcard searching functionality, the out...
    - 在线Office控件 Online Offical Control: Online Office control
    - 软件作品发布平台: SoftwarePublishPlatform, a software publishing platform

    New Releases
    - Demina: Demina Binaries version 0.1: Demina binaries are now available. This release (version 0.1) is an alpha version. Please report any bugs for extermination.
    - EasyTFS: EasyTfs 1.0 Beta 2: Added cache refreshing when contents are updated rather than just every 10 minutes. Added window title based on currently-open case. Added attachme...
    - Extending C# editor - Outlining, classification: Initial release: Initial release
    - HB Batch Encoder Mk 2: HB Batch Encoder Mk2 v1.01: Binary release files.
    - HB Batch Encoder Mk 2: Source Code: Source Code
    - HobbyBrew Mobile: Beta 2: Numerous bugs fixed; an "approximate" implementation of infusion heating added. Update recommended!
    - HouseFly controls: HouseFly controls beta 1.0.2.0: HouseFly controls release 1.0.2.0 beta
    - Html Reader: Beta 2: I fixed a bug in HtmlElementCollection, which exposed an integer enumerator instead of enumerating through HtmlElements. I added a WPF Window tha...
    - Html to OpenXml: HtmlToOpenXml 1.2: Fixes some reported bugs. See change set for description. The dll library to include in your project. The dll is signed for GAC support. Compiled wi...
    - Infection Protection: Infection Protection 0.1: This is the final version of Infection Protection that was entered into the 2010 OGPC game competition.
    - Jobping Url Shortener: Deploy Code 0.5.1: Deployment code for Version 0.5. This version includes our Jobping style.
    - Jobping Url Shortener: Source Code 0.5.1: Source code for the 0.5 release. This release includes our Jobping style skin.
    - Kooboo HTML form: Kooboo HTML form module 2.1.0.1: HTML form module contributed by member aledelgo. Adds SMTP user and password authentication.
    - KooBoo Image Galery: Beta 2: This new version corrects some issues pointed out by Guoqi Zheng. Some schema and folders were renamed, so it's better to uninstall the module and remo...
    - MFCMAPI: May 2010 Release: If you just want to run the tool, get the executable. If you want to debug it, get the symbol file and the source. Build: 6.0.0.1020 The 64 bit bu...
    - MVC Turbine: Release 2.1 for MVC2: This RTM contains the same features as v2.0 RTM plus these features: Instance Registration to IServiceLocator You can now add an instance of a typ...
    - NazTek.Extension.Clr4: NazTek.Extension.Clr4 Binary: Binary release
    - NazTek.Extension.Clr4: NazTek.Extension.Clr4 Source: Cab with source code
    - NSIS Autorun: NSIS Autorun 0.1.8: This release includes source code, executable binaries and example materials.
    - Ottawa IT Day: 2010 Source Code and Presentations: During the Ottawa IT Day 2010, some of the presenters shared their code (and some presentations). This release is the culmination of all those effo...
    - PHPWord: PHPWord 0.6.1 Beta: Changelog: Fixed error when adding a JPEG image and opening in Office 2007 (Issue #1). Fixed already defined constant PHPWORD_BASE_PATH (Issue #2). F...
    - Rapid Dictionary: Rapid Dictionary Alpha 2.0: Release Notes * Try the auto-updatable version: http://install.rapiddict.com/index.html Rapid Dictionary Alpha 2.0 includes such functionality: ...
    - Shake - C# Make: Shake v0.1.18: Core changes. Process wrapper class, console logger, etc.
    - SharpBox: SharpBox-Trunk: This is the SharpBox build from the trunk source branch!
    - SharpBox: SharpBox-Trunk-Initial-Source: The initial source code, will be updated from time to time
    - Spackle.NET: 4.0.0.0 Release: This new drop contains the following: a CreateBigInteger() method on SecureRandom to create random BigInteger values, and an extension method to prop...
    - StreamInsight example queries, input adapters and output adapters: StreamInsight Examples for V1.0 RTM: Zipped source code.
    - The Ping Master: v0.1.0.0: Early release of The Ping Master for test purposes. Configuration tool is unfinished and does not include an installer.
    - Title Safe Region Checker: Title Safe Region Checker v1.0.0.1: Release 1.0 of Title Safe Region Checker. No known bugs or problems. File is a zipped directory containing the necessary installation files.
    - TortoiseHg: TortoiseHg 1.0.3: This is a bug fix release; we recommend all users upgrade to 1.0.3
    - Usa*Usa Libraly: Smart.Windows.Navigation 0.4: Smart.Windows.Navigation simple navigation library ver 0.4.0. Includes Windows Forms & Compact Framework samples. Information - Smart.Windows.Mvc ...
    - VCC: Latest build, v2.1.30513.0: Automatic drop of latest build
    - Wabbitstudio Z80 Software Tools: Wabbitcode: Wabbitcode is a Z80 Assembly IDE for Windows, OS X, and Linux. Built to take full advantage of the features of SPASM and Wabbitemu, Wabbitcode has...
    - white: Release 0.20: Source Code: https://white-project.googlecode.com/svn/tags/0.20 Adds a few more keyboard keys, like the Windows button and F13-F24. Fixed bugs for keyboar...
    - Wildcard Search Web Part for SharePoint 2010: Version 1.0 Release 1: This is the initial release of the Wildcard Search Web Part for SharePoint 2010. All queries will be issued as wildcards unless disabled with the ...
    - Windows Azure Command-line Tools for PHP Developers: Windows Azure Command-line Tools May 2010 Update: May 2010 Update – May 13, 2010. We are pleased to announce the May 2010 update of Windows Azure Command-Line Tools. In addition to bug fixes and i...
    - WinXmlCook: WinXmlCook 2.1: Version 2.1 released!
    - Xrns2XMod: Xrns2XMod 1.1: some source code optimization
    - 在线Office控件 Online Offical Control: SPOffice2.0Release: This version has been tested under MS Office 2003/2007, WPS2009 and WPS2010

    Most Popular Projects
    - Rawr
    - WBFS Manager
    - AJAX Control Toolkit
    - Microsoft SQL Server Product Samples: Database
    - Silverlight Toolkit
    - Windows Presentation Foundation (WPF)
    - patterns & practices – Enterprise Library
    - Microsoft SQL Server Community & Samples
    - PHPExcel
    - ASP.NET

    Most Active Projects
    - patterns & practices – Enterprise Library
    - Mirror Testing System
    - Rawr
    - BlogEngine.NET
    - PHPExcel
    - Microsoft Biology Foundation
    - white
    - Windows Azure Command-line Tools for PHP Developers
    - StyleCop
    - Shake - C# Make

    Read the article

  • Get the onended event for an AudioBuffer in HTML5/Chrome

    - by Matthew James Davis
    So I am playing an audio file in Chrome, and I want to detect when playing has ended so I can delete references to it. Here is my code:

    var source = context.createBufferSource();
    source.buffer = sound.buffer;
    source.loop = sound.loop;
    source.onended = function() {
        delete playingSounds[soundName];
    };
    source.connect(mainNode);
    source.start(0, sound.start, sound.length);

    However, the event handler doesn't fire. Is this event, though described by the W3C specification, not yet supported? Or am I doing something wrong?

    Read the article
