
  • django-social-auth for Facebook is redirecting home and not logging in

    - by Scott Rogowski
    I have had django-social-auth working with Google for quite some time now, but am having problems with Facebook. I am at the point where clicking the /login/facebook/ link takes me to the Facebook authorization page. I then click "Go to App" and it redirects me to my home page, but it neither logs in nor creates a user, and it appends a strange "#_=_" to my URL. Reading up on that, here https://developers.facebook.com/blog/post/552/ and here https://github.com/omab/django-social-auth/issues/199, it seems that would happen if the redirect URI were not defined. However, my Facebook app settings are the following (replacing my site with example.com):
    + App Namespace: "example"
    + Site URL: "http://example.com/complete/facebook/"
    + Site Domain: "example.com"
    + Sandbox Mode: "On"
    + Post-Authorize Redirect URL: "http://apps.facebook.com/example/"
    + Deauthorize URL: "http://www.example.com/"
    + Post-Authorize URL: "http://example.com/complete/facebook/"
    The request that django-social-auth sends to Facebook is (again with my info replaced): "https://www.facebook.com/dialog/oauth?scope=email&state=*&redirect_uri=http%3A%2F%2Fexample.com%2Fcomplete%2Ffacebook%2F%3Fredirect_state%3D***&client_id=*". The /complete/facebook/ path is what the documentation gives, and Google works as /complete/google/. What am I missing here?
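    For reference, a minimal sketch of the Django settings the omab django-social-auth package expects for Facebook; the key values and redirect path below are hypothetical placeholders, not values from the question:

        # settings.py -- sketch; all values are placeholders
        AUTHENTICATION_BACKENDS = (
            'social_auth.backends.facebook.FacebookBackend',
            'django.contrib.auth.backends.ModelBackend',
        )
        FACEBOOK_APP_ID = '123456789012345'
        FACEBOOK_API_SECRET = '0123456789abcdef0123456789abcdef'
        FACEBOOK_EXTENDED_PERMISSIONS = ['email']  # matches scope=email in the dialog URL above
        LOGIN_REDIRECT_URL = '/'

    Two settings-page details are also worth rechecking against the list above: Facebook's "Site URL" is normally the bare site root (http://example.com/) rather than the /complete/facebook/ endpoint, and an app in Sandbox Mode will only complete login for users listed as developers or testers of the app.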


  • SharpDevelop WIX project: MSBuild Configurations

    - by chezy525
    Using SharpDevelop, I wrote a Windows service with a WiX setup project to install and auto-start it. For testing purposes, I've done a number of things I don't want in the release version (e.g. adding an uninstall shortcut to the desktop). So my question really boils down to this: how do you handle build configurations within a WiX project? I think I've solved most of my problems after finding this question: Passing build parameters to .wxs file to dynamically build wix installers. Thus far I've done the following. Added a property that checks the Configuration variable: <Product> ... <Property Id="DEBUG">$(var.Configuration) == 'Debug'</Property> ... Separated all of the debug files into unique components, set up as a separate feature with a condition checking the DEBUG property: <Product> ... <Feature> ... <Feature Id="DebugFiles" Level="1"> <ComponentRef Id="UninstallShortcutComponent" /> <Condition Level="0">DEBUG</Condition> </Feature> ... Then, finally, pointed to the correct file based on the configuration, using the Configuration variable: <Directory> ... <Component> <File Source="..\mainProject\bin\$(var.Configuration)\main.exe" /> </Component> ... So now my question is simplified to how to handle files that may not exist under certain build configurations (like .pdb files). Using all of the above (including pointing the file source at ...\bin\Release\*.pdb, which I know isn't expected to exist) I get an LGHT0103 compiler error: it can't find the file.
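    Because $(var.Configuration) is a WiX preprocessor variable, one common way to handle configuration-only files such as .pdb symbols is to exclude the <File> reference itself at compile time, so the toolchain never looks for a file the Release build did not produce. A sketch, with illustrative Id values:

        <Component Id="MainExecutable" Guid="*">
          <File Id="MainExe" Source="..\mainProject\bin\$(var.Configuration)\main.exe" KeyPath="yes" />
          <?if $(var.Configuration) = "Debug" ?>
            <!-- Emitted only when candle runs with -dConfiguration=Debug,
                 so the missing Release .pdb can never raise LGHT0103. -->
            <File Id="MainPdb" Source="..\mainProject\bin\Debug\main.pdb" />
          <?endif?>
        </Component>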


  • Choosing a W3C valid DOCTYPE and charset combination?

    - by George Carter
    I have a homepage with the following: <!DOCTYPE html> <meta http-equiv="Content-Type" content="text/html; charset=utf-8"> My choice of the "html" DOCTYPE is based on a recommendation for HTML pages using jQuery. My choice of charset=utf-8 is based on a recommendation to make my pages readable on most browsers. But these choices may be wrong. When I run this page through the W3C HTML validator, I get the messages you see below. Is there any way I can eliminate the two errors? ! Using experimental feature: HTML5 Conformance Checker. The validator checked your document with an experimental feature: HTML5 Conformance Checker. This feature has been made available for your convenience, but be aware that it may be unreliable, or not perfectly up to date with the latest development of some cutting-edge technologies. If you find any issue with this feature, please report them. Thank you. Validation Output: 2 Errors 1. Error Line 18, Column 70: Changing character encoding utf-8 and reparsing. …ntent-Type" content="text/html; charset=utf-8"> 2. Error Line 18, Column 70: Changing encoding at this point would need non-streamable behavior. …ntent-Type" content="text/html; charset=utf-8">
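    Both errors usually mean the encoding declaration arrives too late -- here at line 18, after the parser has already committed to a default encoding and must re-decode the page. HTML5 requires the charset declaration within the first 1024 bytes, so a minimal head along these lines (a sketch; the title and lang values are placeholders) normally validates cleanly:

        <!DOCTYPE html>
        <html lang="en">
        <head>
          <!-- Declare the encoding first, well inside the first 1024 bytes,
               so the parser never has to change encodings and reparse. -->
          <meta charset="utf-8">
          <title>Home</title>
        </head>
        <body>
        </body>
        </html>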


  • jQuery sliding animation not working

    - by Jake Zeitz
    I have three divs stacked on each other but offset, so that part of each div is visible. When one of the lower divs is clicked, I want the top div to animate out and back into the stack at the bottom, and the clicked div to appear at the top. So far I only have the code for when the middle div is clicked, but I cannot get it to work properly. What am I doing wrong? (I also realize that the code I wrote is probably terrible; this is the first jQuery code I have written.) The CSS is very simple: .first { z-index: 3; } .second { z-index: 2; } .third { z-index: 1; } The basic HTML is this: <div class="first"></div> <div class="second"></div> <div class="third"></div> Here is my code: $("div.second").click(function () { $("div.first").animate({ left: "-=200px"}, {duration: "fast", complete: function () { $("div.first").removeClass("first").addClass("third").animate({left: "+=350px", top: "+=60px"}, "fast"); } }); $("div.second").animate({ left: "-=24px", top: "-=30px"}, {duration: "fast", complete: function () { $("div.second").removeClass("second").addClass("first"); } }); $("div.third").animate({ left: "-=24px", top: "-=30px"}, {duration: "fast", complete: function () { $("div.third").removeClass("third").addClass("second"); } }); }); I can get div.first to move to the side and back, but I can't get the classes to stay changed. What keeps happening is that div.second removes its class and adds .first in the animation, but when the animation is complete, it acts like it still has a class of .second.
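    One likely culprit: each $("div.first") / $("div.second") / $("div.third") lookup inside the handler runs while the callbacks are swapping classes, so later selectors can match a different element than intended, and a directly bound handler also stays attached to its original element after the class changes. A sketch of one fix (assumes jQuery 1.7+ for delegated .on(); markup as above): cache the elements once, and delegate the binding so "div.second" is re-resolved on every click.

        $(document).on("click", "div.second", function () {
          // Cache references before any class changes so the swaps inside
          // the animation callbacks cannot redirect these lookups.
          var $first = $("div.first"),
              $second = $(this),
              $third = $("div.third");

          $first.animate({ left: "-=200px" }, "fast", function () {
            $first.removeClass("first").addClass("third")
                  .animate({ left: "+=350px", top: "+=60px" }, "fast");
          });
          $second.animate({ left: "-=24px", top: "-=30px" }, "fast", function () {
            $second.removeClass("second").addClass("first");
          });
          $third.animate({ left: "-=24px", top: "-=30px" }, "fast", function () {
            $third.removeClass("third").addClass("second");
          });
        });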


  • Boost shared_ptr use_count function

    - by photo_tom
    My application problem is the following: I have a large structure foo. Because these are large, and for memory-management reasons, we do not wish to delete them when processing on the data is complete. We store them in a std::vector<boost::shared_ptr<foo>>. My question is about knowing when all processing is complete. Our first decision was that we do not want any of the other application code to mark a completed flag in the structure, because there are multiple execution paths through the program and we cannot predict which one finishes last. So in our implementation, once processing is complete, we delete all copies of the boost::shared_ptr<foo> except for the one in the vector. This drops the reference count in the shared_ptr to 1. Is it practical to call use_count() and check whether it equals 1 to know when all other parts of my app are done with the data? One additional reason I'm asking is that the Boost shared_ptr documentation recommends against using use_count() in production code.
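    For illustration, a sketch of that check (the function and type names here are mine): it relies on unique()/use_count(), which Boost documents as debugging aids, so the result is only trustworthy if no other thread can be copying or releasing these pointers while the scan runs.

        #include <boost/shared_ptr.hpp>
        #include <cstddef>
        #include <vector>

        struct foo { /* large, expensive-to-rebuild structure */ };

        // True once the vector holds the only remaining reference to each foo.
        bool all_processing_complete(const std::vector<boost::shared_ptr<foo> >& pool)
        {
            for (std::size_t i = 0; i < pool.size(); ++i)
                if (!pool[i].unique())   // same as use_count() != 1
                    return false;
            return true;
        }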


  • Installing Lubuntu 14.04.1 forcepae fails

    - by Rantanplan
    I tried to install Lubuntu 14.04.1 from a CD. First, I chose "Try Lubuntu without installing", which gave: ERROR: PAE is disabled on this Pentium M (PAE can potentially be enabled with kernel parameter "forcepae" ... Following the description at https://help.ubuntu.com/community/PAE, I added forcepae and tried "Try Lubuntu without installing" again. That worked fine. dmesg | grep -i pae showed: [ 0.000000] Kernel command line: file=/cdrom/preseed/lubuntu.seed boot=casper initrd=/casper/initrd.lz quiet splash -- forcepae [ 0.008118] PAE forced! In the live-CD session, I tried installing Lubuntu by double-clicking the install icon on the desktop. The CD starts spinning but then stops, and nothing happens. Next, I rebooted and tried installing Lubuntu directly from the boot menu screen, again using forcepae. After a while, I received the following error message: The installer encountered an unrecoverable error. A desktop session will now be run so that you may investigate the problem or try installing again. Hitting Enter brings me to the desktop. For what errors should I search? And how? Finally, I rebooted once more and tried "Check disc for defects" with the forcepae option; no errors were found. Now I am wondering how to find the error, or whether it would be better to follow advice c in https://help.ubuntu.com/community/PAE: "Move the hard disk to a computer on which the processor has PAE capability and PAE flag (that is, almost everything else than a Banias). Install the system as usual but don't add restricted drivers. After the install move the disk back." Thanks for some hints! Perhaps some of the following can help.
    On Lubuntu 12.04:
    cat /proc/cpuinfo processor : 0 vendor_id : GenuineIntel cpu family : 6 model : 13 model name : Intel(R) Pentium(R) M processor 1.50GHz stepping : 6 microcode : 0x17 cpu MHz : 600.000 cache size : 2048 KB fdiv_bug : no hlt_bug : no f00f_bug : no coma_bug : no fpu : yes fpu_exception : yes cpuid level : 2 wp : yes flags : fpu vme de pse tsc msr mce cx8 mtrr pge mca cmov clflush dts acpi mmx fxsr sse sse2 ss tm pbe up bts est tm2 bogomips : 1284.76 clflush size : 64 cache_alignment : 64 address sizes : 32 bits physical, 32 bits virtual power management:
    uname -a Linux humboldt 3.2.0-67-generic #101-Ubuntu SMP Tue Jul 15 17:45:51 UTC 2014 i686 i686 i386 GNU/Linux
    lsb_release -a No LSB modules are available. 
Distributor ID: Ubuntu Description: Ubuntu 12.04.5 LTS Release: 12.04 Codename: precise cpuid eax in eax ebx ecx edx 00000000 00000002 756e6547 6c65746e 49656e69 00000001 000006d6 00000816 00000180 afe9f9bf 00000002 02b3b001 000000f0 00000000 2c04307d 80000000 80000004 00000000 00000000 00000000 80000001 00000000 00000000 00000000 00000000 80000002 20202020 20202020 65746e49 2952286c 80000003 6e655020 6d756974 20295228 7270204d 80000004 7365636f 20726f73 30352e31 007a4847 Vendor ID: "GenuineIntel"; CPUID level 2 Intel-specific functions: Version 000006d6: Type 0 - Original OEM Family 6 - Pentium Pro Model 13 - Stepping 6 Reserved 0 Brand index: 22 [not in table] Extended brand string: " Intel(R) Pentium(R) M processor 1.50GHz" CLFLUSH instruction cache line size: 8 Feature flags afe9f9bf: FPU Floating Point Unit VME Virtual 8086 Mode Enhancements DE Debugging Extensions PSE Page Size Extensions TSC Time Stamp Counter MSR Model Specific Registers MCE Machine Check Exception CX8 COMPXCHG8B Instruction SEP Fast System Call MTRR Memory Type Range Registers PGE PTE Global Flag MCA Machine Check Architecture CMOV Conditional Move and Compare Instructions FGPAT Page Attribute Table CLFSH CFLUSH instruction DS Debug store ACPI Thermal Monitor and Clock Ctrl MMX MMX instruction set FXSR Fast FP/MMX Streaming SIMD Extensions save/restore SSE Streaming SIMD Extensions instruction set SSE2 SSE2 extensions SS Self Snoop TM Thermal monitor 31 reserved TLB and cache info: b0: unknown TLB/cache descriptor b3: unknown TLB/cache descriptor 02: Instruction TLB: 4MB pages, 4-way set assoc, 2 entries f0: unknown TLB/cache descriptor 7d: unknown TLB/cache descriptor 30: unknown TLB/cache descriptor 04: Data TLB: 4MB pages, 4-way set assoc, 8 entries 2c: unknown TLB/cache descriptor On Lubuntu 14.04.1 live-CD with forcepae: cat /proc/cpuinfo processor : 0 vendor_id : GenuineIntel cpu family : 6 model : 13 model name : Intel(R) Pentium(R) M processor 1.50GHz stepping : 6 microcode : 0x17 cpu MHz : 600.000 cache size : 2048 KB physical id : 0 siblings : 1 core id : 0 cpu cores : 1 apicid : 0 initial apicid : 0 fdiv_bug : no f00f_bug : no coma_bug : no fpu : yes fpu_exception : yes cpuid level : 2 wp : yes flags : fpu vme de pse tsc msr pae mce cx8 sep mtrr pge mca cmov clflush dts acpi mmx fxsr sse sse2 ss tm pbe bts est tm2 bogomips : 1284.68 clflush size : 64 cache_alignment : 64 address sizes : 36 bits physical, 32 bits virtual power management: uname -a Linux lubuntu 3.13.0-32-generic #57-Ubuntu SMP Tue Jul 15 03:51:12 UTC 2014 i686 i686 i686 GNU/Linux lsb_release -a No LSB modules are available. Distributor ID: Ubuntu Description: Ubuntu 14.04.1 LTS Release: 14.04 Codename: trusty cpuid CPU 0: vendor_id = "GenuineIntel" version information (1/eax): processor type = primary processor (0) family = Intel Pentium Pro/II/III/Celeron/Core/Core 2/Atom, AMD Athlon/Duron, Cyrix M2, VIA C3 (6) model = 0xd (13) stepping id = 0x6 (6) extended family = 0x0 (0) extended model = 0x0 (0) (simple synth) = Intel Pentium M (Dothan B1) / Celeron M (Dothan B1), 90nm miscellaneous (1/ebx): process local APIC physical ID = 0x0 (0) cpu count = 0x0 (0) CLFLUSH line size = 0x8 (8) brand index = 0x16 (22) brand id = 0x16 (22): Intel Pentium M, .13um feature information (1/edx): x87 FPU on chip = true virtual-8086 mode enhancement = true debugging extensions = true page size extensions = true time stamp counter = true RDMSR and WRMSR support = true physical address extensions = false machine check exception = true CMPXCHG8B inst. 
= true APIC on chip = false SYSENTER and SYSEXIT = true memory type range registers = true PTE global bit = true machine check architecture = true conditional move/compare instruction = true page attribute table = true page size extension = false processor serial number = false CLFLUSH instruction = true debug store = true thermal monitor and clock ctrl = true MMX Technology = true FXSAVE/FXRSTOR = true SSE extensions = true SSE2 extensions = true self snoop = true hyper-threading / multi-core supported = false therm. monitor = true IA64 = false pending break event = true feature information (1/ecx): PNI/SSE3: Prescott New Instructions = false PCLMULDQ instruction = false 64-bit debug store = false MONITOR/MWAIT = false CPL-qualified debug store = false VMX: virtual machine extensions = false SMX: safer mode extensions = false Enhanced Intel SpeedStep Technology = true thermal monitor 2 = true SSSE3 extensions = false context ID: adaptive or shared L1 data = false FMA instruction = false CMPXCHG16B instruction = false xTPR disable = false perfmon and debug = false process context identifiers = false direct cache access = false SSE4.1 extensions = false SSE4.2 extensions = false extended xAPIC support = false MOVBE instruction = false POPCNT instruction = false time stamp counter deadline = false AES instruction = false XSAVE/XSTOR states = false OS-enabled XSAVE/XSTOR = false AVX: advanced vector extensions = false F16C half-precision convert instruction = false RDRAND instruction = false hypervisor guest status = false cache and TLB information (2): 0xb0: instruction TLB: 4K, 4-way, 128 entries 0xb3: data TLB: 4K, 4-way, 128 entries 0x02: instruction TLB: 4M pages, 4-way, 2 entries 0xf0: 64 byte prefetching 0x7d: L2 cache: 2M, 8-way, sectored, 64 byte lines 0x30: L1 cache: 32K, 8-way, 64 byte lines 0x04: data TLB: 4M pages, 4-way, 8 entries 0x2c: L1 data cache: 32K, 8-way, 64 byte lines extended feature flags (0x80000001/edx): SYSCALL and SYSRET instructions = false execution disable = false 1-GB large page support = false RDTSCP = false 64-bit extensions technology available = false Intel feature flags (0x80000001/ecx): LAHF/SAHF supported in 64-bit mode = false LZCNT advanced bit manipulation = false 3DNow! PREFETCH/PREFETCHW instructions = false brand = " Intel(R) Pentium(R) M processor 1.50GHz" (multi-processing synth): none (multi-processing method): Intel leaf 1 (synth) = Intel Pentium M (Dothan B1), 90nm
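    On the question of where to look for errors: Ubiquity (the Ubuntu-family desktop installer) logs to the live session's filesystem, so a first pass after a failed run is usually something like the sketch below (these are the conventional log paths; confirm they exist on your live CD):

        # Scan the system log for installer-related failures
        grep -i -E "ubiquity|error|fail|traceback" /var/log/syslog | less
        # Ubiquity's own logs, when present
        ls /var/log/installer/
        less /var/log/installer/debug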


  • Errors when installing Open Office

    - by user109036
    I followed the first set of instructions on this page to install OpenOffice: How to install Open Office? However, at the last step, which says to chmod a folder, I got an error saying that the directory does not exist. OpenOffice now appears in my Ubuntu start menu, but clicking on it does nothing; I tried a reboot. Below is what I could copy from my terminal. I am running the latest Ubuntu. I have not uninstalled LibreOffice, as suggested somewhere, because in the Ubuntu Software Centre LibreOffice appears to be made up of several components (Draw, Math, Writer, Calc) and I don't know which ones to remove (or all, maybe?).
    After this operation, 480 MB of additional disk space will be used. Do you want to continue [Y/n]? y Get:1 http://gb.archive.ubuntu.com/ubuntu/ quantal-updates/universe openjdk-6-jre-lib all 6b24-1.11.5-0ubuntu1~12.10.1 [6,135 kB] Get:2 http://ppa.launchpad.net/upubuntu-com/office/ubuntu/ quantal/main openoffice amd64 3.4~oneiric [321 MB] Get:3 http://gb.archive.ubuntu.com/ubuntu/ quantal/main ca-certificates-java all 20120721 [13.2 kB] Get:4 http://gb.archive.ubuntu.com/ubuntu/ quantal/main tzdata-java all 2012e-0ubuntu2 [140 kB] Get:5 http://gb.archive.ubuntu.com/ubuntu/ quantal/main java-common all 0.43ubuntu3 [61.7 kB] Get:6 http://gb.archive.ubuntu.com/ubuntu/ quantal-updates/universe openjdk-6-jre-headless amd64 6b24-1.11.5-0ubuntu1~12.10.1 [25.4 MB] Get:7 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libgif4 amd64 4.1.6-9.1ubuntu1 [31.3 kB] Get:8 http://gb.archive.ubuntu.com/ubuntu/ quantal-updates/universe openjdk-6-jre amd64 6b24-1.11.5-0ubuntu1~12.10.1 [234 kB] Get:9 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libatk-wrapper-java all 0.30.4-0ubuntu4 [29.8 kB] Get:10 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libatk-wrapper-java-jni amd64 0.30.4-0ubuntu4 [31.1 kB] Get:11 http://gb.archive.ubuntu.com/ubuntu/ quantal/main xorg-sgml-doctools all 1:1.10-1 [12.0 kB] Get:12 http://gb.archive.ubuntu.com/ubuntu/ quantal/main x11proto-core-dev all 7.0.23-1 [744 kB] Get:13 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libice-dev amd64 2:1.0.8-2 [57.6 kB] Get:14 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libpthread-stubs0 amd64 0.3-3 [3,258 B] Get:15 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libpthread-stubs0-dev amd64 0.3-3 [2,866 B] Get:16 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libsm-dev amd64 2:1.2.1-2 [19.9 kB] Get:17 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libxau-dev amd64 1:1.0.7-1 [10.2 kB] Get:18 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libxdmcp-dev amd64 1:1.1.1-1 [26.9 kB] Get:19 http://gb.archive.ubuntu.com/ubuntu/ quantal/main x11proto-input-dev all 2.2-1 [133 kB] Get:20 http://gb.archive.ubuntu.com/ubuntu/ quantal/main x11proto-kb-dev all 1.0.6-2 [269 kB] Get:21 http://gb.archive.ubuntu.com/ubuntu/ quantal/main xtrans-dev all 1.2.7-1 [84.3 kB] Get:22 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libxcb1-dev amd64 1.8.1-1ubuntu1 [82.6 kB] Get:23 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libx11-dev amd64 2:1.5.0-1 [912 kB] Get:24 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libx11-doc all 2:1.5.0-1 [2,460 kB] Get:25 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libxt-dev amd64 1:1.1.3-1 [492 kB] Get:26 http://gb.archive.ubuntu.com/ubuntu/ quantal/main ttf-dejavu-extra all 2.33-2ubuntu1 [3,420 kB] Get:27 http://gb.archive.ubuntu.com/ubuntu/ quantal-updates/universe icedtea-6-jre-cacao amd64 6b24-1.11.5-0ubuntu1~12.10.1 [417 kB] 
Get:28 http://gb.archive.ubuntu.com/ubuntu/ quantal-updates/universe icedtea-6-jre-jamvm amd64 6b24-1.11.5-0ubuntu1~12.10.1 [581 kB] Get:29 http://gb.archive.ubuntu.com/ubuntu/ quantal-updates/main icedtea-netx-common all 1.3-1ubuntu1.1 [617 kB] Get:30 http://gb.archive.ubuntu.com/ubuntu/ quantal-updates/main icedtea-netx amd64 1.3-1ubuntu1.1 [16.2 kB] Get:31 http://gb.archive.ubuntu.com/ubuntu/ quantal-updates/universe openjdk-6-jdk amd64 6b24-1.11.5-0ubuntu1~12.10.1 [11.1 MB] Fetched 374 MB in 9min 18s (671 kB/s) Extract templates from packages: 100% Selecting previously unselected package openjdk-6-jre-lib. (Reading database ... 143191 files and directories currently installed.) Unpacking openjdk-6-jre-lib (from .../openjdk-6-jre-lib_6b24-1.11.5-0ubuntu1~12.10.1_all.deb) ... Selecting previously unselected package ca-certificates-java. Unpacking ca-certificates-java (from .../ca-certificates-java_20120721_all.deb) ... Selecting previously unselected package tzdata-java. Unpacking tzdata-java (from .../tzdata-java_2012e-0ubuntu2_all.deb) ... Selecting previously unselected package java-common. Unpacking java-common (from .../java-common_0.43ubuntu3_all.deb) ... Selecting previously unselected package openjdk-6-jre-headless:amd64. Unpacking openjdk-6-jre-headless:amd64 (from .../openjdk-6-jre-headless_6b24-1.11.5-0ubuntu1~12.10.1_amd64.deb) ... Selecting previously unselected package libgif4:amd64. Unpacking libgif4:amd64 (from .../libgif4_4.1.6-9.1ubuntu1_amd64.deb) ... Selecting previously unselected package openjdk-6-jre:amd64. Unpacking openjdk-6-jre:amd64 (from .../openjdk-6-jre_6b24-1.11.5-0ubuntu1~12.10.1_amd64.deb) ... Selecting previously unselected package libatk-wrapper-java. Unpacking libatk-wrapper-java (from .../libatk-wrapper-java_0.30.4-0ubuntu4_all.deb) ... Selecting previously unselected package libatk-wrapper-java-jni:amd64. Unpacking libatk-wrapper-java-jni:amd64 (from .../libatk-wrapper-java-jni_0.30.4-0ubuntu4_amd64.deb) ... Selecting previously unselected package xorg-sgml-doctools. Unpacking xorg-sgml-doctools (from .../xorg-sgml-doctools_1%3a1.10-1_all.deb) ... Selecting previously unselected package x11proto-core-dev. Unpacking x11proto-core-dev (from .../x11proto-core-dev_7.0.23-1_all.deb) ... Selecting previously unselected package libice-dev:amd64. Unpacking libice-dev:amd64 (from .../libice-dev_2%3a1.0.8-2_amd64.deb) ... Selecting previously unselected package libpthread-stubs0:amd64. Unpacking libpthread-stubs0:amd64 (from .../libpthread-stubs0_0.3-3_amd64.deb) ... Selecting previously unselected package libpthread-stubs0-dev:amd64. Unpacking libpthread-stubs0-dev:amd64 (from .../libpthread-stubs0-dev_0.3-3_amd64.deb) ... Selecting previously unselected package libsm-dev:amd64. Unpacking libsm-dev:amd64 (from .../libsm-dev_2%3a1.2.1-2_amd64.deb) ... Selecting previously unselected package libxau-dev:amd64. Unpacking libxau-dev:amd64 (from .../libxau-dev_1%3a1.0.7-1_amd64.deb) ... Selecting previously unselected package libxdmcp-dev:amd64. Unpacking libxdmcp-dev:amd64 (from .../libxdmcp-dev_1%3a1.1.1-1_amd64.deb) ... Selecting previously unselected package x11proto-input-dev. Unpacking x11proto-input-dev (from .../x11proto-input-dev_2.2-1_all.deb) ... Selecting previously unselected package x11proto-kb-dev. Unpacking x11proto-kb-dev (from .../x11proto-kb-dev_1.0.6-2_all.deb) ... Selecting previously unselected package xtrans-dev. Unpacking xtrans-dev (from .../xtrans-dev_1.2.7-1_all.deb) ... Selecting previously unselected package libxcb1-dev:amd64. 
Unpacking libxcb1-dev:amd64 (from .../libxcb1-dev_1.8.1-1ubuntu1_amd64.deb) ... Selecting previously unselected package libx11-dev:amd64. Unpacking libx11-dev:amd64 (from .../libx11-dev_2%3a1.5.0-1_amd64.deb) ... Selecting previously unselected package libx11-doc. Unpacking libx11-doc (from .../libx11-doc_2%3a1.5.0-1_all.deb) ... Selecting previously unselected package libxt-dev:amd64. Unpacking libxt-dev:amd64 (from .../libxt-dev_1%3a1.1.3-1_amd64.deb) ... Selecting previously unselected package ttf-dejavu-extra. Unpacking ttf-dejavu-extra (from .../ttf-dejavu-extra_2.33-2ubuntu1_all.deb) ... Selecting previously unselected package icedtea-6-jre-cacao:amd64. Unpacking icedtea-6-jre-cacao:amd64 (from .../icedtea-6-jre-cacao_6b24-1.11.5-0ubuntu1~12.10.1_amd64.deb) ... Selecting previously unselected package icedtea-6-jre-jamvm:amd64. Unpacking icedtea-6-jre-jamvm:amd64 (from .../icedtea-6-jre-jamvm_6b24-1.11.5-0ubuntu1~12.10.1_amd64.deb) ... Selecting previously unselected package icedtea-netx-common. Unpacking icedtea-netx-common (from .../icedtea-netx-common_1.3-1ubuntu1.1_all.deb) ... Selecting previously unselected package icedtea-netx:amd64. Unpacking icedtea-netx:amd64 (from .../icedtea-netx_1.3-1ubuntu1.1_amd64.deb) ... Selecting previously unselected package openjdk-6-jdk:amd64. Unpacking openjdk-6-jdk:amd64 (from .../openjdk-6-jdk_6b24-1.11.5-0ubuntu1~12.10.1_amd64.deb) ... Selecting previously unselected package openoffice. Unpacking openoffice (from .../openoffice_3.4~oneiric_amd64.deb) ... Processing triggers for doc-base ... Processing 2 added doc-base files... Processing triggers for man-db ... Processing triggers for desktop-file-utils ... Processing triggers for bamfdaemon ... Rebuilding /usr/share/applications/bamf.index... Processing triggers for gnome-menus ... Processing triggers for hicolor-icon-theme ... Processing triggers for fontconfig ... Processing triggers for gnome-icon-theme ... Processing triggers for shared-mime-info ... Setting up tzdata-java (2012e-0ubuntu2) ... Setting up java-common (0.43ubuntu3) ... Setting up libgif4:amd64 (4.1.6-9.1ubuntu1) ... Setting up xorg-sgml-doctools (1:1.10-1) ... Setting up x11proto-core-dev (7.0.23-1) ... Setting up libice-dev:amd64 (2:1.0.8-2) ... Setting up libpthread-stubs0:amd64 (0.3-3) ... Setting up libpthread-stubs0-dev:amd64 (0.3-3) ... Setting up libsm-dev:amd64 (2:1.2.1-2) ... Setting up libxau-dev:amd64 (1:1.0.7-1) ... Setting up libxdmcp-dev:amd64 (1:1.1.1-1) ... Setting up x11proto-input-dev (2.2-1) ... Setting up x11proto-kb-dev (1.0.6-2) ... Setting up xtrans-dev (1.2.7-1) ... Setting up libxcb1-dev:amd64 (1.8.1-1ubuntu1) ... Setting up libx11-dev:amd64 (2:1.5.0-1) ... Setting up libx11-doc (2:1.5.0-1) ... Setting up libxt-dev:amd64 (1:1.1.3-1) ... Setting up ttf-dejavu-extra (2.33-2ubuntu1) ... Setting up icedtea-netx-common (1.3-1ubuntu1.1) ... Setting up openjdk-6-jre-lib (6b24-1.11.5-0ubuntu1~12.10.1) ... Setting up openjdk-6-jre-headless:amd64 (6b24-1.11.5-0ubuntu1~12.10.1) ... 
update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/java to provide /usr/bin/java (java) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/keytool to provide /usr/bin/keytool (keytool) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/pack200 to provide /usr/bin/pack200 (pack200) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/rmid to provide /usr/bin/rmid (rmid) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/rmiregistry to provide /usr/bin/rmiregistry (rmiregistry) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/unpack200 to provide /usr/bin/unpack200 (unpack200) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/orbd to provide /usr/bin/orbd (orbd) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/servertool to provide /usr/bin/servertool (servertool) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/tnameserv to provide /usr/bin/tnameserv (tnameserv) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/jre/lib/jexec to provide /usr/bin/jexec (jexec) in auto mode Setting up ca-certificates-java (20120721) ... Adding debian:Deutsche_Telekom_Root_CA_2.pem Adding debian:Comodo_Trusted_Services_root.pem Adding debian:Certum_Trusted_Network_CA.pem Adding debian:thawte_Primary_Root_CA_-_G2.pem Adding debian:UTN_USERFirst_Hardware_Root_CA.pem Adding debian:AddTrust_Low-Value_Services_Root.pem Adding debian:Microsec_e-Szigno_Root_CA.pem Adding debian:SwissSign_Silver_CA_-_G2.pem Adding debian:ComSign_Secured_CA.pem Adding debian:Buypass_Class_2_CA_1.pem Adding debian:Verisign_Class_1_Public_Primary_Certification_Authority_-_G3.pem Adding debian:Certum_Root_CA.pem Adding debian:AddTrust_External_Root.pem Adding debian:Chambers_of_Commerce_Root_-_2008.pem Adding debian:Starfield_Root_Certificate_Authority_-_G2.pem Adding debian:Verisign_Class_1_Public_Primary_Certification_Authority_-_G2.pem Adding debian:Visa_eCommerce_Root.pem Adding debian:Digital_Signature_Trust_Co._Global_CA_3.pem Adding debian:AC_Raíz_Certicámara_S.A..pem Adding debian:NetLock_Arany_=Class_Gold=_Fotanúsítvány.pem Adding debian:Taiwan_GRCA.pem Adding debian:Camerfirma_Chambers_of_Commerce_Root.pem Adding debian:Juur-SK.pem Adding debian:Entrust.net_Premium_2048_Secure_Server_CA.pem Adding debian:XRamp_Global_CA_Root.pem Adding debian:Security_Communication_RootCA2.pem Adding debian:AddTrust_Qualified_Certificates_Root.pem Adding debian:NetLock_Qualified_=Class_QA=_Root.pem Adding debian:TC_TrustCenter_Class_2_CA_II.pem Adding debian:DST_ACES_CA_X6.pem Adding debian:thawte_Primary_Root_CA.pem Adding debian:thawte_Primary_Root_CA_-_G3.pem Adding debian:GeoTrust_Universal_CA_2.pem Adding debian:ACEDICOM_Root.pem Adding debian:Security_Communication_EV_RootCA1.pem Adding debian:America_Online_Root_Certification_Authority_2.pem Adding debian:TC_TrustCenter_Universal_CA_I.pem Adding debian:SwissSign_Platinum_CA_-_G2.pem Adding debian:Global_Chambersign_Root_-_2008.pem Adding debian:SecureSign_RootCA11.pem Adding debian:GeoTrust_Global_CA_2.pem Adding debian:Buypass_Class_3_CA_1.pem Adding debian:Baltimore_CyberTrust_Root.pem Adding debian:UbuntuOne-Go_Daddy_Class_2_CA.pem Adding debian:Equifax_Secure_eBusiness_CA_1.pem Adding debian:SwissSign_Gold_CA_-_G2.pem Adding debian:AffirmTrust_Premium_ECC.pem Adding 
debian:TC_TrustCenter_Universal_CA_III.pem Adding debian:ca.pem Adding debian:Verisign_Class_3_Public_Primary_Certification_Authority_-_G2.pem Adding debian:NetLock_Express_=Class_C=_Root.pem Adding debian:VeriSign_Class_3_Public_Primary_Certification_Authority_-_G5.pem Adding debian:Firmaprofesional_Root_CA.pem Adding debian:Comodo_Secure_Services_root.pem Adding debian:cacert.org.pem Adding debian:GeoTrust_Primary_Certification_Authority.pem Adding debian:RSA_Security_2048_v3.pem Adding debian:Staat_der_Nederlanden_Root_CA.pem Adding debian:Cybertrust_Global_Root.pem Adding debian:DigiCert_High_Assurance_EV_Root_CA.pem Adding debian:TDC_OCES_Root_CA.pem Adding debian:A-Trust-nQual-03.pem Adding debian:Equifax_Secure_CA.pem Adding debian:Digital_Signature_Trust_Co._Global_CA_1.pem Adding debian:GeoTrust_Global_CA.pem Adding debian:Starfield_Class_2_CA.pem Adding debian:ApplicationCA_-_Japanese_Government.pem Adding debian:Swisscom_Root_CA_1.pem Adding debian:Verisign_Class_2_Public_Primary_Certification_Authority_-_G2.pem Adding debian:Camerfirma_Global_Chambersign_Root.pem Adding debian:QuoVadis_Root_CA_3.pem Adding debian:QuoVadis_Root_CA.pem Adding debian:Comodo_AAA_Services_root.pem Adding debian:ComSign_CA.pem Adding debian:AddTrust_Public_Services_Root.pem Adding debian:DigiCert_Assured_ID_Root_CA.pem Adding debian:UTN_DATACorp_SGC_Root_CA.pem Adding debian:CA_Disig.pem Adding debian:E-Guven_Kok_Elektronik_Sertifika_Hizmet_Saglayicisi.pem Adding debian:GlobalSign_Root_CA_-_R3.pem Adding debian:QuoVadis_Root_CA_2.pem Adding debian:Entrust_Root_Certification_Authority.pem Adding debian:GTE_CyberTrust_Global_Root.pem Adding debian:ValiCert_Class_1_VA.pem Adding debian:Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem Adding debian:GeoTrust_Primary_Certification_Authority_-_G2.pem Adding debian:spi-ca-2003.pem Adding debian:America_Online_Root_Certification_Authority_1.pem Adding debian:AffirmTrust_Premium.pem Adding debian:Sonera_Class_1_Root_CA.pem Adding debian:Verisign_Class_2_Public_Primary_Certification_Authority_-_G3.pem Adding debian:Certplus_Class_2_Primary_CA.pem Adding debian:TURKTRUST_Certificate_Services_Provider_Root_2.pem Adding debian:Network_Solutions_Certificate_Authority.pem Adding debian:Go_Daddy_Class_2_CA.pem Adding debian:StartCom_Certification_Authority.pem Adding debian:Hongkong_Post_Root_CA_1.pem Adding debian:Hellenic_Academic_and_Research_Institutions_RootCA_2011.pem Adding debian:Thawte_Premium_Server_CA.pem Adding debian:EBG_Elektronik_Sertifika_Hizmet_Saglayicisi.pem Adding debian:TURKTRUST_Certificate_Services_Provider_Root_1.pem Adding debian:NetLock_Business_=Class_B=_Root.pem Adding debian:Microsec_e-Szigno_Root_CA_2009.pem Adding debian:DigiCert_Global_Root_CA.pem Adding debian:VeriSign_Class_3_Public_Primary_Certification_Authority_-_G4.pem Adding debian:IGC_A.pem Adding debian:TWCA_Root_Certification_Authority.pem Adding debian:S-TRUST_Authentication_and_Encryption_Root_CA_2005_PN.pem Adding debian:VeriSign_Universal_Root_Certification_Authority.pem Adding debian:DST_Root_CA_X3.pem Adding debian:Verisign_Class_1_Public_Primary_Certification_Authority.pem Adding debian:Root_CA_Generalitat_Valenciana.pem Adding debian:UTN_USERFirst_Email_Root_CA.pem Adding debian:ssl-cert-snakeoil.pem Adding debian:Starfield_Services_Root_Certificate_Authority_-_G2.pem Adding debian:GeoTrust_Primary_Certification_Authority_-_G3.pem Adding debian:Certinomis_-_Autorité_Racine.pem Adding debian:Verisign_Class_3_Public_Primary_Certification_Authority.pem 
Adding debian:TDC_Internet_Root_CA.pem Adding debian:UbuntuOne-ValiCert_Class_2_VA.pem Adding debian:AffirmTrust_Commercial.pem Adding debian:spi-cacert-2008.pem Adding debian:Izenpe.com.pem Adding debian:EC-ACC.pem Adding debian:Go_Daddy_Root_Certificate_Authority_-_G2.pem Adding debian:COMODO_ECC_Certification_Authority.pem Adding debian:CNNIC_ROOT.pem Adding debian:NetLock_Notary_=Class_A=_Root.pem Adding debian:Equifax_Secure_eBusiness_CA_2.pem Adding debian:Verisign_Class_3_Public_Primary_Certification_Authority_-_G3.pem Adding debian:Secure_Global_CA.pem Adding debian:UbuntuOne-Go_Daddy_CA.pem Adding debian:GeoTrust_Universal_CA.pem Adding debian:Wells_Fargo_Root_CA.pem Adding debian:Thawte_Server_CA.pem Adding debian:WellsSecure_Public_Root_Certificate_Authority.pem Adding debian:TC_TrustCenter_Class_3_CA_II.pem Adding debian:COMODO_Certification_Authority.pem Adding debian:Equifax_Secure_Global_eBusiness_CA.pem Adding debian:Security_Communication_Root_CA.pem Adding debian:GlobalSign_Root_CA_-_R2.pem Adding debian:TÜBITAK_UEKAE_Kök_Sertifika_Hizmet_Saglayicisi_-_Sürüm_3.pem Adding debian:Verisign_Class_4_Public_Primary_Certification_Authority_-_G3.pem Adding debian:certSIGN_ROOT_CA.pem Adding debian:RSA_Root_Certificate_1.pem Adding debian:ePKI_Root_Certification_Authority.pem Adding debian:Entrust.net_Secure_Server_CA.pem Adding debian:OISTE_WISeKey_Global_Root_GA_CA.pem Adding debian:Sonera_Class_2_Root_CA.pem Adding debian:Certigna.pem Adding debian:AffirmTrust_Networking.pem Adding debian:ValiCert_Class_2_VA.pem Adding debian:GlobalSign_Root_CA.pem Adding debian:Staat_der_Nederlanden_Root_CA_-_G2.pem Adding debian:SecureTrust_CA.pem done. Setting up openjdk-6-jre:amd64 (6b24-1.11.5-0ubuntu1~12.10.1) ... update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/policytool to provide /usr/bin/policytool (policytool) in auto mode Setting up libatk-wrapper-java (0.30.4-0ubuntu4) ... Setting up icedtea-6-jre-cacao:amd64 (6b24-1.11.5-0ubuntu1~12.10.1) ... Setting up icedtea-6-jre-jamvm:amd64 (6b24-1.11.5-0ubuntu1~12.10.1) ... Setting up icedtea-netx:amd64 (1.3-1ubuntu1.1) ... update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/javaws to provide /usr/bin/javaws (javaws) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/itweb-settings to provide /usr/bin/itweb-settings (itweb-settings) in auto mode update-alternatives: using /usr/lib/jvm/java-7-openjdk-amd64/jre/bin/javaws to provide /usr/bin/javaws (javaws) in auto mode update-alternatives: using /usr/lib/jvm/java-7-openjdk-amd64/jre/bin/itweb-settings to provide /usr/bin/itweb-settings (itweb-settings) in auto mode Setting up openjdk-6-jdk:amd64 (6b24-1.11.5-0ubuntu1~12.10.1) ... 
update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/appletviewer to provide /usr/bin/appletviewer (appletviewer) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/extcheck to provide /usr/bin/extcheck (extcheck) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/idlj to provide /usr/bin/idlj (idlj) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/jar to provide /usr/bin/jar (jar) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/jarsigner to provide /usr/bin/jarsigner (jarsigner) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/javac to provide /usr/bin/javac (javac) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/javadoc to provide /usr/bin/javadoc (javadoc) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/javah to provide /usr/bin/javah (javah) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/javap to provide /usr/bin/javap (javap) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/jconsole to provide /usr/bin/jconsole (jconsole) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/jdb to provide /usr/bin/jdb (jdb) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/jhat to provide /usr/bin/jhat (jhat) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/jinfo to provide /usr/bin/jinfo (jinfo) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/jmap to provide /usr/bin/jmap (jmap) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/jps to provide /usr/bin/jps (jps) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/jrunscript to provide /usr/bin/jrunscript (jrunscript) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/jsadebugd to provide /usr/bin/jsadebugd (jsadebugd) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/jstack to provide /usr/bin/jstack (jstack) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/jstat to provide /usr/bin/jstat (jstat) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/jstatd to provide /usr/bin/jstatd (jstatd) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/native2ascii to provide /usr/bin/native2ascii (native2ascii) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/rmic to provide /usr/bin/rmic (rmic) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/schemagen to provide /usr/bin/schemagen (schemagen) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/serialver to provide /usr/bin/serialver (serialver) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/wsgen to provide /usr/bin/wsgen (wsgen) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/wsimport to provide /usr/bin/wsimport (wsimport) in auto mode update-alternatives: using /usr/lib/jvm/java-6-openjdk-amd64/bin/xjc to provide /usr/bin/xjc (xjc) in auto mode Setting up openoffice (3.4~oneiric) ... Setting up libatk-wrapper-java-jni:amd64 (0.30.4-0ubuntu4) ... Processing triggers for libc-bin ... 
ldconfig deferred processing now taking place philip@X301-2:~$ sudo apt-get install libxrandr2:i386 libxinerama1:i386 Reading package lists... Done Building dependency tree Reading state information... Done The following package was automatically installed and is no longer required: linux-headers-3.5.0-17 Use 'apt-get autoremove' to remove it. The following extra packages will be installed: gcc-4.7-base:i386 libc6:i386 libgcc1:i386 libx11-6:i386 libxau6:i386 libxcb1:i386 libxdmcp6:i386 libxext6:i386 libxrender1:i386 Suggested packages: glibc-doc:i386 locales:i386 The following NEW packages will be installed gcc-4.7-base:i386 libc6:i386 libgcc1:i386 libx11-6:i386 libxau6:i386 libxcb1:i386 libxdmcp6:i386 libxext6:i386 libxinerama1:i386 libxrandr2:i386 libxrender1:i386 0 upgraded, 11 newly installed, 0 to remove and 93 not upgraded. Need to get 4,936 kB of archives. After this operation, 11.9 MB of additional disk space will be used. Do you want to continue [Y/n]? y Get:1 http://gb.archive.ubuntu.com/ubuntu/ quantal/main gcc-4.7-base i386 4.7.2-2ubuntu1 [15.5 kB] Get:2 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libc6 i386 2.15-0ubuntu20 [3,940 kB] Get:3 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libgcc1 i386 1:4.7.2-2ubuntu1 [53.5 kB] Get:4 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libxau6 i386 1:1.0.7-1 [8,582 B] Get:5 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libxdmcp6 i386 1:1.1.1-1 [13.1 kB] Get:6 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libxcb1 i386 1.8.1-1ubuntu1 [48.7 kB] Get:7 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libx11-6 i386 2:1.5.0-1 [776 kB] Get:8 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libxext6 i386 2:1.3.1-2 [33.9 kB] Get:9 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libxinerama1 i386 2:1.1.2-1 [8,118 B] Get:10 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libxrender1 i386 1:0.9.7-1 [20.1 kB] Get:11 http://gb.archive.ubuntu.com/ubuntu/ quantal/main libxrandr2 i386 2:1.4.0-1 [18.8 kB] Fetched 4,936 kB in 30s (161 kB/s) Preconfiguring packages ... Selecting previously unselected package gcc-4.7-base:i386. (Reading database ... 146005 files and directories currently installed.) Unpacking gcc-4.7-base:i386 (from .../gcc-4.7-base_4.7.2-2ubuntu1_i386.deb) ... Selecting previously unselected package libc6:i386. Unpacking libc6:i386 (from .../libc6_2.15-0ubuntu20_i386.deb) ... Selecting previously unselected package libgcc1:i386. Unpacking libgcc1:i386 (from .../libgcc1_1%3a4.7.2-2ubuntu1_i386.deb) ... Selecting previously unselected package libxau6:i386. Unpacking libxau6:i386 (from .../libxau6_1%3a1.0.7-1_i386.deb) ... Selecting previously unselected package libxdmcp6:i386. Unpacking libxdmcp6:i386 (from .../libxdmcp6_1%3a1.1.1-1_i386.deb) ... Selecting previously unselected package libxcb1:i386. Unpacking libxcb1:i386 (from .../libxcb1_1.8.1-1ubuntu1_i386.deb) ... Selecting previously unselected package libx11-6:i386. Unpacking libx11-6:i386 (from .../libx11-6_2%3a1.5.0-1_i386.deb) ... Selecting previously unselected package libxext6:i386. Unpacking libxext6:i386 (from .../libxext6_2%3a1.3.1-2_i386.deb) ... Selecting previously unselected package libxinerama1:i386. Unpacking libxinerama1:i386 (from .../libxinerama1_2%3a1.1.2-1_i386.deb) ... Selecting previously unselected package libxrender1:i386. Unpacking libxrender1:i386 (from .../libxrender1_1%3a0.9.7-1_i386.deb) ... Selecting previously unselected package libxrandr2:i386. Unpacking libxrandr2:i386 (from .../libxrandr2_2%3a1.4.0-1_i386.deb) ... 
Setting up gcc-4.7-base:i386 (4.7.2-2ubuntu1) ... Setting up libc6:i386 (2.15-0ubuntu20) ... Setting up libgcc1:i386 (1:4.7.2-2ubuntu1) ... Setting up libxau6:i386 (1:1.0.7-1) ... Setting up libxdmcp6:i386 (1:1.1.1-1) ... Setting up libxcb1:i386 (1.8.1-1ubuntu1) ... Setting up libx11-6:i386 (2:1.5.0-1) ... Setting up libxext6:i386 (2:1.3.1-2) ... Setting up libxinerama1:i386 (2:1.1.2-1) ... Setting up libxrender1:i386 (1:0.9.7-1) ... Setting up libxrandr2:i386 (2:1.4.0-1) ... Processing triggers for libc-bin ... ldconfig deferred processing now taking place $ sudo chmod a+rx /opt/openoffice.org3/share/uno_packages/cache/uno_packages chmod: cannot access `/opt/openoffice.org3/share/uno_packages/cache/uno_packages': No such file or directory
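    When that final chmod fails, a quick way to see where the package really put its files is to ask dpkg for the package's file list. A sketch (the package name "openoffice" is the one installed from the PPA in the log above):

        # List everything the PPA package installed and look for the cache dir
        dpkg -L openoffice | grep -i uno_packages
        # Then apply the permissions to the path dpkg actually reports, e.g.:
        # sudo chmod a+rx <path-reported-above>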


  • CodePlex Daily Summary for Saturday, May 15, 2010

    New Projects
    BizTalk EDI Guidance: BizTalk EDI Guidance is intended to simplify the delivery of EDI solutions by leveraging the ESB Toolkit. This project is currently Alpha and sh...
    Continues Integration Sample: I'm providing a series of blog posts to show a complete CI process using CruiseControl.Net and msbuild. The source code for this series is hosted here.
    DioM2D: My Dragons in our Midst RPG. Runs on my custom Starlight Engine.
    Ethical Hacking ASP.NET: Security tools and guidelines for white-hat hacking and protecting ASP.NET web applications.
    Farseer Engine with XNATouch: Farseer is a great engine for game physics. This implementation uses the XNATouch framework.
    Feature Builder Guidance Extensions: Feature Builder Guidance Extensions are Feature Extensions which extend the guidance for the Feature Building experience. Each FBGX will be suppli...
    Microsoft Office Document Security: MODS is a plugin for Office 2007 that includes Hash Encryption, Hex Conversion and more. Plugins: MODS For Word still working on (MODS for Excel ...
    Minimize Engine (XNA): The Minimize Engine is a basic 3D games engine created using XNA, with its primary focus around grid-based games.
    MSForge TownCrier: This project is meant to build a notification and calling system for MSForge.net User Groups.
    NatureProtector: Silverlight 4 project.
    OutSync: OutSync is a free Windows desktop application that syncs photos of your Facebook friends with matching contacts in Microsoft Outlook. It allows you...
    Quick Save Images, Clipboard save to file, Quick save, bmp, png, jpeg, Image: ClipSa is a very small tool for very quick picture saving. You put some picture into the clipboard (PrintScrn/Alt-PrintScrn/Ctrl-C), ClipSa saves ...
    ResHelper Manager: Resource strings management tool that creates localization files for any type of localization target (asp.net, wpf and so on...)
    SecureCookieHttpModule: Secure your session cookie (and other session-based cookies) against replay attacks using this easy-to-use ASP.NET HttpModule.
    simpleChMS: A Church Management System (ChMS) designed for churches or ministries like youth groups that want to facilitate better care of their membership. Fo...
    sMAPtool: -
    SPDomainObject: mapping strongly typed objects to SP lists
    SQL Trim: This project aims at developing a universal trim function for Microsoft SQL Server. It trims: 1) pre spaces 2) post spaces 3) double spaces 3) subs...
    TurretGunner: mt-experience
    New Releases
    BeanProxy: BeanProxy 3.0: BeanProxy is a C# (.NET 3.5) library housing classes that facilitate unit testing. Any non-static, public interface/class or abstract class can be...
    Blueset Studio Opensource Projects: 蓝色之风记事本 0.2 Alpha: An extremely buggy version…
    CSharp Intellisense: V2.1: - Bug fix (Pascal Casing)
    DioM2D: DioM2D0.01: http://www.dragonsinourmidst.com/forums/showthread.php?p=690058#post690058
    Ethical Hacking ASP.NET: Version 1.0.0.1: This is the initial release of the project. Read more about the available tests and features on the Documentation tab. You need the full .NET Frame...
    Event Scavenger: Collector service update - version 3.2.4: Added a check that the database connection string is set up in the config file.
    Feature Builder Guidance Extensions: FBGX-Binaries: This release consists of a zip file containing all the VSIXs resulting from building each of the FBGX packages found here as source. This will mak...
    Floe IRC Client: Floe IRC Client 2010-05 R2: - Detaching windows (right click on the tabs to detach them) - Highlight lines with your nick or other patterns - Fixed several bugs - Tabs can now...
    Free language translator and file converter: Free Language Translator 1.96: Fixed some minor bugs and improved the UI a bit. If you cannot install the msi file you might be missing some prerequisites. You can try running t...
    Geocache Downloader: release 1.0: This is the first release.
    kp.net: Alpha release is available: The goal of this alpha release is to try the code in some production scenarios and find out what features should be tuned.
    Live-Exchange Calendar Sync: Live-Exchange Calendar Sync: Beta, May 14, 2010 release of Live-Exchange Calendar Sync 1.0 BETA. (Version 45334) Getting Started: Info about installat...
    MAPILab Explorer for SharePoint: MAPILab Explorer for SharePoint ver 2.1.1: 1) Small bug fixed that appears on first start (when earlier versions weren't installed). How to install: Download ZIP file and extract it on Sha...
    Microsoft Office Document Security: MODS 4 WORD (SOURCE INCLUDED): Includes Source Code
    MoonyDesk (windows desktop widgets): MoonyDesk Alpha: MoonyDesk Alpha (some memory improvements)
    OnTopReplica: Release 2.9.3: Some bugfixes and improvements. Czech translation added (thanks René Mihula).
    OutSync: OutSync v1.0.100.0: OutSync v1.0.100.0 is the final release by Mel before the move to CodePlex. I have tested it on Windows 7 32bit and 64bit with Office 2007 and it ...
    Quick Save Images, Clipboard save to file, Quick save, bmp, png, jpeg, Image: Clipsa v 0.1: Download and extract to any place 2 files - clipSa.exe and clipSa.exe.config. Run clipSa.exe. That's all.
    ResHelper Manager: ResHelperManager: The list of changes applied to this version of ResHelper is included in the main download zip package. Example sources: In the Source Code tab are sources of De...
    Rx Contrib: V1.3: - Bug Fix - BufferWithTimeOrCount with flexible time period setting whenever the time period elapsed...
    SharePoint DVK Integration: SharePoint 2007 DVK integration v1.0.3: Fixes: Fixed default field bindings (I rebound too many fields on every page load). Fixed extension replacing on creating target url (threw it out)...
    ShoutcastStast for DotNetNuke: DNN_ShoutcastStats alpha 05.00.495: First Alpha release of the ShoutcastStats Module for DotNetNuke. This first alpha version is still in devel...
    SilverPart 2.1: SilverPart 2.1: This interim release fixes some major bugs related to Firefox and anonymous access. - Fix for Issue ID 4005 - SilverPart does not w...
    sMAPtool: sMAPedit v0.7c (Base Release with Maps): Fixed: force a garbage collection update to prevent pictureBox's memory leak. Added: essential map pack with all basic maps in jpg format. Added:...
    SQL Trim: Trim: Initial release
    SSIS Multiple Hash: Multiple Hash V1.2.1: This is version 1.2.1 of the Multiple Hash SSIS Component. It supports SQL 2005 and SQL 2008, although you have to download the correct install pa...
    StreamInsight Yahoo Finance input adapter example: StockTicker_v1_0_RTM: Updated for StreamInsight RTM.
    Update Controls .NET: 2.1.0.0: Automatic dependency management for WPF and Silverlight data binding. This release combines both the WPF and Silverlight assemblies into one insta...
    VCC: Latest build, v2.1.30514.0: Automatic drop of latest build
    Most Popular Projects: Rawr; WBFS Manager; AJAX Control Toolkit; Microsoft SQL Server Product Samples: Database; Silverlight Toolkit; Windows Presentation Foundation (WPF); patterns & practices – Enterprise Library; Microsoft SQL Server Community & Samples; PHPExcel; ASP.NET
    Most Active Projects: patterns & practices – Enterprise Library; Mirror Testing System; Rawr; PHPExcel; BlogEngine.NET; Microsoft Biology Foundation; Customer Portal Accelerator for Microsoft Dynamics CRM; Windows Azure Command-line Tools for PHP Developers; Shake - C# Make; StyleCop


  • RequestValidation Changes in ASP.NET 4.0

    - by Rick Strahl
    There’s been a change in how the ValidateRequest attribute on WebForms works in ASP.NET 4.0. I noticed this today while updating a post on my WebLog, whose entries all contain raw HTML and so pretty much all trigger request validation. I recently upgraded this app from ASP.NET 2.0 to 4.0 and it’s now failing to update posts. At first this was difficult to track down because of custom error handling in my app – the custom error handler traps the exception and logs it with only basic error information, so the full detail of the error was initially hidden. After some more experimentation in development mode, the error that occurs is the typical ASP.NET request validation error (‘A potentially dangerous Request.Form value was detected…’), which looks like this in ASP.NET 4.0: [error page screenshot] At first I was real perplexed, as I didn’t read the entire error message and because my page does have: <%@ Page Language="C#" AutoEventWireup="true" CodeBehind="NewEntry.aspx.cs" Inherits="Westwind.WebLog.NewEntry" MasterPageFile="~/App_Templates/Standard/AdminMaster.master" ValidateRequest="false" EnableEventValidation="false" EnableViewState="false" %> WTF? ValidateRequest would seem like it should be enough, but alas, in ASP.NET 4.0 that setting alone is no longer enough. Reading the fine print in the error explains that you need to explicitly set the requestValidationMode for the application back to V2.0 in web.config: <httpRuntime executionTimeout="300" requestValidationMode="2.0" /> Kudos to the ASP.NET team for putting up a nice error message that tells me how to fix this problem, but excuse me, why the heck would you change this behavior so that it requires an explicit override of an optional and by-default-disabled page-level switch? You’ve just turned a relatively simple fix into a nasty morass of hard-to-discover configuration settings??? The original behavior was perfectly discoverable via attributes in the page. Now you can set this attribute in the page, get completely unexpected behavior, and be required to set what effectively amounts to a backwards-compatibility flag in the configuration file. It turns out the real reason for the .config flag is that request validation has moved from the WebForms pipeline down into the entire ASP.NET/IIS request pipeline and is now applied to all requests. Here’s what the breaking changes page from Microsoft says about it: The request validation feature in ASP.NET provides a certain level of default protection against cross-site scripting (XSS) attacks. In previous versions of ASP.NET, request validation was enabled by default. However, it applied only to ASP.NET pages (.aspx files and their class files) and only when those pages were executing. In ASP.NET 4, by default, request validation is enabled for all requests, because it is enabled before the BeginRequest phase of an HTTP request. As a result, request validation applies to requests for all ASP.NET resources, not just .aspx page requests. This includes requests such as Web service calls and custom HTTP handlers. Request validation is also active when custom HTTP modules are reading the contents of an HTTP request. As a result, request validation errors might now occur for requests that previously did not trigger errors. 
    To revert to the behavior of the ASP.NET 2.0 request validation feature, add the following setting in the Web.config file: <httpRuntime requestValidationMode="2.0" /> However, we recommend that you analyze any request validation errors to determine whether existing handlers, modules, or other custom code accesses potentially unsafe HTTP inputs that could be XSS attack vectors. OK, so ValidateRequest on the form still works as it always has, but it’s actually the ASP.NET event pipeline, not WebForms, that throws the above exception, since request validation is applied to every request that hits the pipeline. Creating the runtime override removes the HttpRuntime checking and restores the WebForms-only behavior. That fixes my immediate problem, but it still leaves me wondering, especially given the vague wording of the explanation above. One important detail is missing from that description: request validation is applied only to application/x-www-form-urlencoded POST content, not to all inbound POST data. When I first read the description it freaked me out, because it sounds like literally ANY request hitting the pipeline is affected. To make sure this is not really so, I created a quick handler: public class Handler1 : IHttpHandler { public void ProcessRequest(HttpContext context) { context.Response.ContentType = "text/plain"; context.Response.Write("Hello World <hr>" + context.Request.Form.ToString()); } public bool IsReusable { get { return false; } } } and called it with Fiddler by posting some XML to the handler using the default form-urlencoded POST content type: [Fiddler screenshot] and sure enough – hitting the handler also causes the request validation error and a 500 server response. Changing the content type to text/xml fixes the problem, however, by bypassing the request validation filter, so Web service/AJAX handlers and custom modules/handlers that implement custom protocols aren’t affected as long as they work with special input content types. It also looks like multipart encoding does not trigger the runtime’s validation either, so this request also works fine: POST http://rasnote/weblog/handler1.ashx HTTP/1.1 Content-Type: multipart/form-data; boundary=------7cf2a327f01ae User-Agent: West Wind Internet Protocols 5.53 Host: rasnote Content-Length: 40 Pragma: no-cache <xml>asdasd</xml>--------7cf2a327f01ae *That* probably should trigger request validation – since it is a potential HTML form submission – but it doesn’t. New Runtime Feature, Global Scope Only? OK, so request validation is now a runtime feature, but sadly it’s a feature scoped to the ASP.NET runtime – effectively to the entire running application/app domain. You can still manually force validation using Request.ValidateInput(), which gives you the option to do this in code, but realistically that will only work with requestValidationMode set to V2.0 as well, since the 4.0 mode auto-fires before code ever gets a chance to intercept the call. Given all that, the new setting in ASP.NET 4.0 seems to limit options and make things more difficult and less flexible. Of course Microsoft gets to say ASP.NET is more secure by default because of it, but what good is that if you have to turn off this flag the very first time you need to allow one single request that bypasses request validation??? This is really shortsighted design… <sigh> © Rick Strahl, West Wind Technologies, 2005-2010. Posted in ASP.NET
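    For quick reference, the override discussed above shown in context; this is the same httpRuntime setting quoted from the breaking-changes page, placed inside a minimal web.config (the executionTimeout value from the post is incidental and omitted here):

        <configuration>
          <system.web>
            <!-- Restores 2.0-style request validation so the page-level
                 ValidateRequest="false" attribute is honored again. -->
            <httpRuntime requestValidationMode="2.0" />
          </system.web>
        </configuration>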

    Read the article

  • Setup and Use SpecFlow BDD with DevExpress XAF

    - by Patrick Liekhus
    Let’s get started with using the SpecFlow BDD syntax for writing tests with the DevExpress XAF EasyTest scripting syntax.  In order for this to work you will need to download and install the prerequisites listed below.  Once they are installed, follow the steps outlined below and enjoy. Prerequisites Install the following items: DevExpress eXpress Application Framework (XAF) found here SpecFlow found here Liekhus BDD/XAF Testing library found here Assumptions I am going to assume at this point that you have created your XAF application and have your Module, Win.Module and Win ready for usage.  You should have also set any attributes and/or settings as you see fit. Setup So where to start. Create a new testing project within your solution. I typically follow a naming convention similar to the one used by XAF: my project name plus .FunctionalTests (i.e. AlbumManager.FunctionalTests). Add the following references to your project.  It should look like the reference list below. DevExpress.Data.v11.x DevExpress.Persistent.Base.v11.x DevExpress.Persistent.BaseImpl.v11.x DevExpress.Xpo.v11.2 Liekhus.Testing.BDD.Core Liekhus.Testing.BDD.DevExpress TechTalk.SpecFlow TestExecutor.v11.x (found in %Program Files%\DevExpress 2011.x\eXpressApp Framework\Tools\EasyTest) Right-click the TestExecutor reference and set the “Copy Local” setting to True.  This forces the TestExecutor executable to be available in the bin directory, which is where the EasyTest script will be executed further down in the process. Add an Application Configuration File (app.config) to your test application.  You will need to make a few modifications to have SpecFlow generate Microsoft-style unit tests.  First add the section handler for SpecFlow and then set your choice of testing framework (a sketch of this config appears at the end of this entry).  I prefer MS Tests for my projects. Add the EasyTest configuration file to your project.  Add a new XML file and call it Config.xml. Open the properties window for the Config.xml file and set the “Copy to Output Directory” setting to “Copy Always”. You will set up the Config file according to the specifications of the EasyTest library by mapping to your executable and other settings.  You can find the details for the configuration of EasyTest here.  My file looks like this Create a new folder in your test project called “StepDefinitions”.  Add a new SpecFlow Step Definition file item under the StepDefinitions folder.  I typically call this class StepDefinition.cs. Have your step definition inherit from the Liekhus.Testing.BDD.DevExpress.StepDefinition class.  This will give you the default behaviors for your test in the next section. OK.  Now that we have done this series of steps, we will work on simplifying this.  This is an early preview of this new project and is not fully ready for consumption.  If you would like to experiment with it, please feel free.  Our goals are to make this an installable project on its own, with its own project templates and default settings.  This will be coming in later versions.  Currently this project is in Alpha release. Let’s write our first test Remove the basic test that is created for you. We will not use the default test but rather create our own SpecFlow “Feature” files. Add a new item to your project and select the SpecFlow Feature file under C#. Name your feature file, as you do your class files, after the test it is performing. Writing a feature file uses the Cucumber syntax of Given… When… Then.  Think of it in these terms.  Givens are the pre-conditions for the test.  
    The Whens are the actual steps for the test being performed.  The Thens are the verification steps that confirm whether your test passed or failed.  All of these steps are generated into an EasyTest format and executed against your XAF project.  You can find more on the Cucumber syntax by using the Secret Ninja Cucumber Scrolls.  This document has several good styles of tests, plus you can get your fill of Chuck Norris vs Ninjas.  Pretty humorous document, but full of great content. My first test is going to test the entry of a new Album into the application and is outlined below (a sketch of the feature file appears at the end of this entry). The Feature section at the top is more for your documentation purposes.  Try to be descriptive of the test so that it makes sense to the next person behind you.  The Scenario outline is described in the Ninja Scrolls, but think of it as a test template.  You can write one test outline and have multiple datasets (Scenarios) executed against that test.  Here are the steps of my test and their descriptions: Given I am starting a new test – tells our test to create a new EasyTest file And (Given) the application is open – tells EasyTest to open our application defined in the Config.xml When I am at the “Albums” screen – tells XAF to navigate to the Albums list view And (When) I click the “New:Album” button – tells XAF to click the New Album button on the ribbon And (When) I enter the following information – tells XAF to find the field on the screen and put the value in that field And (When) I click the “Save and Close” button – tells XAF to click the “Save and Close” button on the detail window Then I verify results as “user” – tells the testing framework to execute the EasyTest as your configured user Once you compile and prepare your tests you should see the following in your Test View.  For each of your CreateNewAlbum lines in your scenarios, you will see a new test ready to execute. From here you will use your testing framework of choice to execute the test.  This in turn will execute the EasyTest framework to call back into your XAF application and test your business application. Again, please remember that this is an early preview and we are still working out the details.  Please let us know if you have any comments/questions/concerns. Thanks and happy testing.
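    Since the original screenshots aren’t reproduced here, below is a minimal sketch of the app.config changes described above (the SpecFlow section handler plus MSTest as the unit test provider). The attribute details are assumptions based on SpecFlow conventions of that era, not taken from the post:

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <configSections>
        <!-- Section handler so SpecFlow can read its settings -->
        <section name="specFlow"
                 type="TechTalk.SpecFlow.Configuration.ConfigurationSectionHandler, TechTalk.SpecFlow" />
      </configSections>
      <specFlow>
        <!-- Generate Microsoft-style (MSTest) tests from feature files -->
        <unitTestProvider name="MsTest" />
      </specFlow>
    </configuration>

    And here is a sketch of the CreateNewAlbum feature file assembled from the steps listed above; the field names and example data are hypothetical:

    Feature: Create New Album
      In order to manage my music collection
      As an application user
      I want to create a new album entry

    Scenario Outline: CreateNewAlbum
      Given I am starting a new test
      And the application is open
      When I am at the "Albums" screen
      And I click the "New:Album" button
      And I enter the following information
        | Field | Value  |
        | Name  | <Name> |
      And I click the "Save and Close" button
      Then I verify results as "user"

      Examples:
        | Name       |
        | Abbey Road |
        | Thriller   |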

    Read the article

  • Oracle releases Java SE 7 Update 7, and Java SE 6 Update 35

    - by Henrik Stahl
    This morning, Oracle released updates to JDK 6 and 7. For more information on these releases, see: Security Alert for CVE-2012-4681 Released, and the Release notes. Oracle recommends that users apply these updates as soon as possible. Users of Oracle JRE 6 and 7 for Windows (32-bit) and the recently released JRE 7 for Mac OSX (64-bit) will be updated automatically. For more information, see this blog entry.

    Read the article

  • Project Nashorn Slides & Talks

    At the Eclipse Demo Camp in Hamburg last week I got asked about resources on Project Nashorn. So, I compiled a quick list: slides from Jim Laskey's JavaOne 2011 talk titled "The Future of JavaScript in the JDK"; slides from Bernard Traversat's JavaOne 2011 talk titled "HTML5 and Java: The Facts and the Myths"; slides and video from Jim Laskey's JVM Language Summit talk titled "Adventures in JSR 292 (Nashorn)".

    Read the article

  • CI Deployment Of Azure Web Roles Using TeamCity

    - by srkirkland
    After recently migrating an important new website to use Windows Azure “Web Roles” I wanted an easier way to deploy new versions to the Azure Staging environment as well as a reliable process to roll back deployments to a certain “known good” source control commit checkpoint.  By configuring our JetBrains TeamCity CI server to utilize Windows Azure PowerShell cmdlets to create new automated deployments, I’ll show you how to take control of your Azure publish process. Step 0: Configuring your Azure Project in Visual Studio Before we can start looking at automating the deployment, we should make sure manual deployments from Visual Studio are working properly.  Detailed information for setting up deployments can be found at http://msdn.microsoft.com/en-us/library/windowsazure/ff683672.aspx#PublishAzure or by doing some quick Googling, but the basics are as follows: Install the prerequisite Windows Azure SDK Create an Azure project by right-clicking on your web project and choosing “Add Windows Azure Cloud Service Project” (or by manually adding that project type) Configure your Role and Service Configuration/Definition as desired Right-click on your Azure project and choose “Publish,” create a publish profile, and push to your web role You don’t actually have to do step #4 and create a publish profile, but it’s a good exercise to make sure everything is working properly.  Once your Windows Azure project is set up correctly, we are ready to move on to understanding the Azure publish process. Understanding the Azure Publish Process The actual Windows Azure project is fairly simple at its core—it builds your dependent roles (in our case, a web role) against a specific service and build configuration, and outputs two files: ServiceConfiguration.Cloud.cscfg: This is just the file containing your package configuration info, for example Instance Count, OsFamily, ConnectionString and other Setting information. ProjectName.Azure.cspkg: This is the package file that contains the guts of your deployment, including all deployable files. When you package your Azure project, these two files will be created within the directory ./[ProjectName].Azure/bin/[ConfigName]/app.publish/.  If you want to build your Azure project from the command line, it’s as simple as calling MSBuild on the “Publish” target: msbuild.exe /target:Publish Windows Azure PowerShell Cmdlets The last pieces of the puzzle that make CI automation possible are the Azure PowerShell Cmdlets (http://msdn.microsoft.com/en-us/library/windowsazure/jj156055.aspx).  These cmdlets are what will let us create deployments without Visual Studio or other user intervention. Preparing TeamCity for Azure Deployments Now we are ready to get our TeamCity server set up so it can build and deploy Windows Azure projects, which we now know requires the Azure SDK and the Windows Azure PowerShell Cmdlets. Installing the Azure SDK is easy enough: just go to https://www.windowsazure.com/en-us/develop/net/ and click “Install”. Once this SDK is installed, I recommend running a test build to make sure your project is building correctly.  You’ll want to set up your build step using MSBuild with the “Publish” target against your solution file.  Mine looks like this: Assuming the build was successful, you will now have the two *.cspkg and *.cscfg files within your build directory.  
If the build was red (failed), take a look at the build logs and keep an eye out for “unsupported project type” or other build errors, which will need to be addressed before the CI deployment can be completed. With a successful build we are now ready to install and configure the Windows Azure PowerShell Cmdlets: Follow the instructions at http://msdn.microsoft.com/en-us/library/windowsazure/jj554332 to install the Cmdlets and configure PowerShell After installing the Cmdlets, you’ll need to get your Azure Subscription Info using the Get-AzurePublishSettingsFile command. Store the resulting *.publishsettings file somewhere you can get to easily, like C:\TeamCity, because you will need to reference it later from your deploy script. Scripting the CI Deploy Process Now that the cmdlets are installed on our TeamCity server, we are ready to script the actual deployment using a TeamCity “PowerShell” build runner.  Before we look at any code, here’s a breakdown of our deployment algorithm: Setup your variables, including the location of the *.cspkg and *cscfg files produced in the earlier MSBuild step (remember, the folder is something like [ProjectName].Azure/bin/[ConfigName]/app.publish/ Import the Windows Azure PowerShell Cmdlets Import and set your Azure Subscription information (this is basically your authentication/authorization step, so protect your settings file Now look for a current deployment, and if you find one Upgrade it, else Create a new deployment Pretty simple and straightforward.  Now let’s look at the code (also available as a gist here: https://gist.github.com/3694398): $subscription = "[Your Subscription Name]" $service = "[Your Azure Service Name]" $slot = "staging" #staging or production $package = "[ProjectName]\bin\[BuildConfigName]\app.publish\[ProjectName].cspkg" $configuration = "[ProjectName]\bin\[BuildConfigName]\app.publish\ServiceConfiguration.Cloud.cscfg" $timeStampFormat = "g" $deploymentLabel = "ContinuousDeploy to $service v%build.number%"   Write-Output "Running Azure Imports" Import-Module "C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\*.psd1" Import-AzurePublishSettingsFile "C:\TeamCity\[PSFileName].publishsettings" Set-AzureSubscription -CurrentStorageAccount $service -SubscriptionName $subscription   function Publish(){ $deployment = Get-AzureDeployment -ServiceName $service -Slot $slot -ErrorVariable a -ErrorAction silentlycontinue   if ($a[0] -ne $null) { Write-Output "$(Get-Date -f $timeStampFormat) - No deployment is detected. Creating a new deployment. " } if ($deployment.Name -ne $null) { #Update deployment inplace (usually faster, cheaper, won't destroy VIP) Write-Output "$(Get-Date -f $timeStampFormat) - Deployment exists in $servicename. Upgrading deployment." 
UpgradeDeployment } else { CreateNewDeployment } }   function CreateNewDeployment() { write-progress -id 3 -activity "Creating New Deployment" -Status "In progress" Write-Output "$(Get-Date -f $timeStampFormat) - Creating New Deployment: In progress"   $opstat = New-AzureDeployment -Slot $slot -Package $package -Configuration $configuration -label $deploymentLabel -ServiceName $service   $completeDeployment = Get-AzureDeployment -ServiceName $service -Slot $slot $completeDeploymentID = $completeDeployment.deploymentid   write-progress -id 3 -activity "Creating New Deployment" -completed -Status "Complete" Write-Output "$(Get-Date -f $timeStampFormat) - Creating New Deployment: Complete, Deployment ID: $completeDeploymentID" }   function UpgradeDeployment() { write-progress -id 3 -activity "Upgrading Deployment" -Status "In progress" Write-Output "$(Get-Date -f $timeStampFormat) - Upgrading Deployment: In progress"   # perform Update-Deployment $setdeployment = Set-AzureDeployment -Upgrade -Slot $slot -Package $package -Configuration $configuration -label $deploymentLabel -ServiceName $service -Force   $completeDeployment = Get-AzureDeployment -ServiceName $service -Slot $slot $completeDeploymentID = $completeDeployment.deploymentid   write-progress -id 3 -activity "Upgrading Deployment" -completed -Status "Complete" Write-Output "$(Get-Date -f $timeStampFormat) - Upgrading Deployment: Complete, Deployment ID: $completeDeploymentID" }   Write-Output "Create Azure Deployment" Publish   Creating the TeamCity Build Step The only thing left is to create a second build step, after your MSBuild “Publish” step, with the build runner type “PowerShell”.  Then set your script to “Source Code,” the script execution mode to “Put script into PowerShell stdin with “-Command” arguments” and then copy/paste in the above script (replacing the placeholder sections with your values).  This should look like the following:   Wrap Up After combining the MSBuild /target:Publish step (which creates the necessary Windows Azure *.cspkg and *.cscfg files) and a PowerShell script step which utilizes the Azure PowerShell Cmdlets, we have a fully deployable build configuration in TeamCity.  You can configure this step to run whenever you’d like using build triggers – for example, you could even deploy whenever a new master branch deploy comes in and passes all required tests. In the script I’ve hardcoded that every deployment goes to the Staging environment on Azure, but you could deploy straight to Production if you want to, or even setup a deployment configuration variable and set it as desired. After your TeamCity Build Configuration is complete, you’ll see something that looks like this: Whenever you click the “Run” button, all of your code will be compiled, published, and deployed to Windows Azure! One additional enormous benefit of automating the process this way is that you can easily deploy any specific source control changeset by clicking the little ellipsis button next to "Run.”  This will bring up a dialog like the one below, where you can select the last change to use for your deployment.  Since Azure Web Role deployments don’t have any rollback functionality, this is a critical feature.   Enjoy!
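    For reference, here is a minimal sketch of the one-time subscription setup on the build agent that the script above assumes has already been done; the file name and path are placeholders, not values from the post:

    # Opens a browser for Azure sign-in and downloads a *.publishsettings file.
    Get-AzurePublishSettingsFile

    # Import the downloaded credentials once so later builds can authenticate.
    Import-AzurePublishSettingsFile "C:\TeamCity\MySubscription.publishsettings"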

    Read the article

  • OAM11gR2: Enabling SSL in the Data Store

    - by Ekta Malik
    Enabling SSL in the Data Store of OAM11gR2 comprises the steps mentioned below. Import the certificate/s required for establishing the trust with the Store (backend) into the keystore (cacerts) on the machine hosting OAM's Weblogic Admin server Restart the Weblogic Admin server Specify the <Hostname>:<SSL port> in the "Location" field of the Data Store and select the "Enable SSL" checkbox Prerequisites: Certificate/s to be imported are available for import Data Store has already been created using the OAM admin console and the connection to the store is successful on the non-SSL port (though one can always create a Data Store with SSL settings on the first go) Steps for importing the certificate/s: One can use the keytool utility that comes bundled with the JDK to import the certificate. The step for importing the certificate would be the same for self-signed and third-party certificates (like VeriSign) $JAVA_HOME/bin/keytool -import -v -noprompt -trustcacerts -alias <aliasname> -file <Path to the certificate file> -keystore $JAVA_HOME/jre/lib/security/cacerts Here $JAVA_HOME refers to the path of the JDK install directory Note: In case multiple certificates are required for establishing the trust, import all those certificates using the same keytool command mentioned above  One can verify the import of the certificate/s by using the below mentioned command $JAVA_HOME/bin/keytool -list -alias <aliasname> -v -keystore $JAVA_HOME/jre/lib/security/cacerts When the trust gets established for the SSL communication, specifying the SSL-specific settings in the Data Store (via the OAM admin console) wouldn't result in the previously seen error (seen when certificates are yet to be imported) and the "Test Connection" would be successful.

    Read the article

  • 14+ WordPress Portfolio Themes

    - by Edward
    There are various portfolio themes for WordPress out there; with this collection we are trying to help you choose the best one. These themes can be used to create any type of personal, photography, art or corporate portfolio. Display 3 in 1 Display 3 in 1 – Business & Portfolio WordPress Theme. Features a fantastic 3D image slideshow that can be controlled from your backend with a custom tool. The theme has a huge WordPress custom backend (8 additional admin pages) that makes customization of the theme easy for those who don't know much about coding or WordPress. Price: $40 View Demo Download DeepFocus Tempting features such as automatic separation of blog and portfolio content by template, publishing of the most important information on the homepage, styles to choose from, and many more. It also provides page templates for blog, portfolio, blog archive, tags etc. Its best feature is that it helps you manage everything from one place. Price: $39 (Package includes more than 55 themes) View Demo Download SimplePress Simple, yet awesome. One of the best portfolio themes. Price: $39 (Package includes more than 55 themes) View Demo Download Graphix Graphix is one of the best WordPress portfolio themes. It is most suited to aspiring designers, developers, artists and photographers who'd like a framework theme which has a great-looking portfolio with a feature-rich blog. It has a theme options page, 5 color styles, SEO options, featured content blocks, a drop-down multi-level menu, social profile link custom widgets, custom posts, custom page templates etc. Price: $69 Single & $149 Developer Package View Demo Download Bizznizz It boasts many features such as a custom homepage, custom post types, custom widgets, portfolio templates, alternative styles and many more. View Demo Download Showtime The ultimate WordPress theme for you to create your web portfolio. It has 3 different styles for you to choose from. Price: $40 View Demo Download Montana WP Horizontal Portfolio Theme Montana Theme – WP Horizontal Portfolio Theme, best suited for creative studios to showcase design, photography, illustration, paintings and art. Price: $30 View Demo Download OverALL OverALL Premium WordPress Blog & Portfolio Theme is low priced & has tons of amazing features. Price: $17 View Demo Download Habitat Habitat – Blog and Portfolio Theme. Unique portfolio sorting/filtering with a custom jQuery script (each entry supports multiple images or a video); multiple featured images for each post to generate individual slideshows per post, or the option to directly embed video content from YouTube, Vimeo, Hulu etc. Price: $35 View Demo Download Fresh Folio Fresh Folio from WooThemes can be used as both a portfolio and a premium WordPress theme. The theme is a remix of the Fresh News Theme and Proud Folio Theme which combines all the best elements of the respective blog and portfolio style themes. View Demo Download Fresh Folio Features: Can be used to create an impressive portfolio. 7 diverse theme styles to choose from (default, blue, red, grunge light, grunge floral, antique, blue creamer, nightlife) The template will automatically (visually) separate your blog & portfolio content, making this an amazing theme for aspiring designers, developers, artists, photographers etc. Unique page template types for the portfolio, blog, blog archives, tags & search results. Integrated Theme Options (for WordPress) to tweak the layout, colour scheme etc. for the theme. Optional Automatic Image Resize, which is used to dynamically create the thumbnails and featured images. Includes widget-enabled sidebars. eGallery eGallery is a theme made to transform your WordPress blog into a fully functional online portfolio. The theme is perfectly designed to emphasize the artwork you choose to showcase. The design has been greatly enhanced using JavaScript, and is easy to implement. Price: $39 (Package includes more than 55 themes) View Demo Download ProudFolio ProudFolio is a premium portfolio WordPress theme from WooThemes. The theme is for designers, developers, artists and photographers who would like a showcase theme that works as a portfolio and also serves the purpose of a blog. ProudFolio puts a strong emphasis on the portfolio pieces, allowing for decent-sized thumbnails, huge fullscreen views via Lightbox, and full details on the single page. The theme file also contains a choice of three different background images and color schemes. Price: $70 Single $150 Developer License View Demo Download Features: The template will automatically (visually) separate your blog & portfolio content. A unique homepage layout, which publishes only the most important information; unique page templates for the portfolio, blog, blog archives, tags & search results; Integrated Theme Options (for WordPress) to tweak the layout, colour scheme etc. for the theme; a built-in video panel, which you can use to publish any web-based Flash videos; Automatic Image Resize, which is used to dynamically create the thumbnails and featured images; custom page templates for Archives, Sitemap & Image Gallery; built-in Gravatar support for authors & comments; an integrated banner management script to display randomized banner ads of your choice site-wide; pretty drop-down navigation everywhere; and widget-enabled sidebars. Portfolio WordPress Theme A FREE WordPress theme designed for web portfolios and (for now) just for web portfolios. It comes with an administrative panel from where you can edit the head quote text, edit all theme colors, font families and font sizes, and fill in a curriculum vitae and display it on a special page. Theme demo and download can be found here Viz | Biz Viz | Biz is a premium WordPress photo gallery and portfolio theme designed specifically for photographers, graphic designers and web designers who want to display their creative work online and market their services, as well as have a typical text blog, using the power and flexibility of WordPress. It is priced at $79.95. Theme Features: Premium quality portfolio template Custom logo uploader to replace the standard graphic with your own unique look from the WP Dashboard Integrated blog component (front images are custom fields and thumbnails, but you can also have a typical blog) Four tabbed feature areas (About Me, Services, Recent Posts, and Tags) Two home page feature photos (you choose which photos to feature using a WP category) Manage your online portfolio through the WordPress CMS Crop two sizes of your work: one for the front page thumbnails and another full-size version, and upload to WP Search engine optimized. Related posts: 14 WordPress Photo Blog & Portfolio Themes 6 PhotoBlog Portfolio WordPress Themes Professional WordPress Business Themes

    Read the article

  • Improving Manageability of Virtual Environments

    - by Jeff Victor
    Boot Environments for Solaris 10 Branded Zones Until recently, Solaris 10 Branded Zones on Solaris 11 suffered one notable regression: Live Upgrade did not work. The individual packaging and patching tools work correctly, but the ability to upgrade Solaris while the production workload continued running did not exist. A recent Solaris 11 SRU (Solaris 11.1 SRU 6.4) restored most of that functionality, although with a slightly different concept, different commands, and without all of the feature details. This new method gives you the ability to create and manage multiple boot environments (BEs) for a Solaris 10 Branded Zone, and modify the active or any inactive BE, and to do so while the production workload continues to run. Background In case you are new to Solaris: Solaris includes a set of features that enables you to create a bootable Solaris image, called a Boot Environment (BE). This newly created image can be modified while the original BE is still running your workload(s). There are many benefits, including improved uptime and the ability to reboot into (or downgrade to) an older BE if a newer one has a problem. In Solaris 10 this set of features was named Live Upgrade. Solaris 11 applies the same basic concepts to the new packaging system (IPS) but there isn't a specific name for the feature set. The features are simply part of IPS. Solaris 11 Boot Environments are not discussed in this blog entry. Although a Solaris 10 system can have multiple BEs, until recently a Solaris 10 Branded Zone (BZ) in a Solaris 11 system did not have this ability. This limitation was addressed recently, and that enhancement is the subject of this blog entry. This new implementation uses two concepts. The first is the use of a ZFS clone for each BE. This makes it very easy to create a BE, or many BEs. This is a distinct advantage over the Live Upgrade feature set in Solaris 10, which had a practical limitation of two BEs on a system, when using UFS. The second new concept is a very simple mechanism to indicate the BE that should be booted: a ZFS property. The new ZFS property is named com.oracle.zones.solaris10:activebe (isn't that creative? ). It's important to note that the property is inherited from the original BE's file system to any BEs you create. In other words, all BEs in one zone have the same value for that property. When the (Solaris 11) global zone boots the Solaris 10 BZ, it boots the BE that has the name that is stored in the activebe property. Here is a quick summary of the actions you can use to manage these BEs: To create a BE: Create a ZFS clone of the zone's root dataset To activate a BE: Set the ZFS property of the root dataset to indicate the BE To add a package or patch to an inactive BE: Mount the inactive BE Add packages or patches to it Unmount the inactive BE To list the available BEs: Use the "zfs list" command. To destroy a BE: Use the "zfs destroy" command. Preparation Before you can use the new features, you will need a Solaris 10 BZ on a Solaris 11 system. You can use these three steps - on a real Solaris 11.1 server or in a VirtualBox guest running Solaris 11.1 - to create a Solaris 10 BZ. The Solaris 11.1 environment must be at SRU 6.4 or newer. Create a flash archive on the Solaris 10 system s10# flarcreate -n s10-system /net/zones/archives/s10-system.flar Configure the Solaris 10 BZ on the Solaris 11 system s11# zonecfg -z s10z Use 'create' to begin configuring a new zone. 
zonecfg:s10z create -t SYSsolaris10 zonecfg:s10z set zonepath=/zones/s10z zonecfg:s10z exit s11# zoneadm list -cv ID NAME STATUS PATH BRAND IP 0 global running / solaris shared - s10z configured /zones/s10z solaris10 excl Install the zone from the flash archive s11# zoneadm -z s10z install -a /net/zones/archives/s10-system.flar -p You can find more information about the migration of Solaris 10 environments to Solaris 10 Branded Zones in the documentation. The rest of this blog entry demonstrates the commands you can use to accomplish the aforementioned actions related to BEs. New features in action Note that the demonstration of the commands occurs in the Solaris 10 BZ, as indicated by the shell prompt "s10z# ". Many of these commands can be performed in the global zone instead, if you prefer. If you perform them in the global zone, you must change the ZFS file system names. Create The only complicated action is the creation of a BE. In the Solaris 10 BZ, create a new "boot environment" - a ZFS clone. You can assign any name to the final portion of the clone's name, as long as it meets the requirements for a ZFS file system name. s10z# zfs snapshot rpool/ROOT/zbe-0@snap s10z# zfs clone -o mountpoint=/ -o canmount=noauto rpool/ROOT/zbe-0@snap rpool/ROOT/newBE cannot mount 'rpool/ROOT/newBE' on '/': directory is not empty filesystem successfully created, but not mounted You can safely ignore that message: we already know that / is not empty! We have merely told ZFS that the default mountpoint for the clone is the root directory. List the available BEs and active BE Because each BE is represented by a clone of the rpool/ROOT dataset, listing the BEs is as simple as listing the clones. s10z# zfs list -r rpool/ROOT NAME USED AVAIL REFER MOUNTPOINT rpool/ROOT 3.55G 42.9G 31K legacy rpool/ROOT/zbe-0 1K 42.9G 3.55G / rpool/ROOT/newBE 3.55G 42.9G 3.55G / The output shows that two BEs exist. Their names are "zbe-0" and "newBE". You can tell Solaris that one particular BE should be used when the zone next boots by using a ZFS property. Its name is com.oracle.zones.solaris10:activebe. The value of that property is the name of the clone that contains the BE that should be booted. s10z# zfs get com.oracle.zones.solaris10:activebe rpool/ROOT NAME PROPERTY VALUE SOURCE rpool/ROOT com.oracle.zones.solaris10:activebe zbe-0 local Change the active BE When you want to change the BE that will be booted next time, you can just change the activebe property on the rpool/ROOT dataset. s10z# zfs get com.oracle.zones.solaris10:activebe rpool/ROOT NAME PROPERTY VALUE SOURCE rpool/ROOT com.oracle.zones.solaris10:activebe zbe-0 local s10z# zfs set com.oracle.zones.solaris10:activebe=newBE rpool/ROOT s10z# zfs get com.oracle.zones.solaris10:activebe rpool/ROOT NAME PROPERTY VALUE SOURCE rpool/ROOT com.oracle.zones.solaris10:activebe newBE local s10z# shutdown -y -g0 -i6 After the zone has rebooted: s10z# zfs get com.oracle.zones.solaris10:activebe rpool/ROOT rpool/ROOT com.oracle.zones.solaris10:activebe newBE local s10z# zfs mount rpool/ROOT/newBE / rpool/export /export rpool/export/home /export/home rpool /rpool Mount the original BE to see that it's still there. s10z# zfs mount -o mountpoint=/mnt rpool/ROOT/zbe-0 s10z# ls /mnt Desktop export platform Documents export.backup.20130607T214951Z proc S10Flar home rpool TT_DB kernel sbin bin lib system boot lost+found tmp cdrom mnt usr dev net var etc opt Patch an inactive BE At this point, you can modify the original BE. 
If you would prefer to modify the new BE, you can restore the original value to the activebe property and reboot, and then mount the new BE to /mnt (or another empty directory) and modify it. Let's mount the original BE so we can modify it. (The first command is only needed if you haven't already mounted that BE.) s10z# zfs mount -o mountpoint=/mnt rpool/ROOT/zbe-0 s10z# patchadd -R /mnt -M /var/sadm/spool 104945-02 Note that the typical usage will be: Create a BE Mount the new (inactive) BE Use the package and patch tools to update the new BE Unmount the new BE Reboot Delete an inactive BE ZFS clones are children of their parent file systems. In order to destroy the parent, you must first "promote" the child. This reverses the parent-child relationship. (For more information on this, see the documentation.) The original rpool/ROOT file system is the parent of the clones that you create as BEs. In order to destroy an earlier BE that is that parent of other BEs, you must first promote one of the child BEs to be the ZFS parent. Only then can you destroy the original BE. Fortunately, this is easier to do than to explain: s10z# zfs promote rpool/ROOT/newBE s10z# zfs destroy rpool/ROOT/zbe-0 s10z# zfs list -r rpool/ROOT NAME USED AVAIL REFER MOUNTPOINT rpool/ROOT 3.56G 269G 31K legacy rpool/ROOT/newBE 3.56G 269G 3.55G / Documentation This feature is so new, it is not yet described in the Solaris 11 documentation. However, MOS note 1558773.1 offers some details. Conclusion With this new feature, you can add and patch packages to boot environments of a Solaris 10 Branded Zone. This ability improves the manageability of these zones, and makes their use more practical. It also means that you can use the existing P2V tools with earlier Solaris 10 updates, and modify the environments after they become Solaris 10 Branded Zones.

    Read the article

  • Java DB talks at JavaOne 2012

    - by kah
    It's soon time for JavaOne again in San Francisco, and Java DB is represented this year too. Dag Wanvik will give an introductory talk on Java DB on Tuesday, October 2 at 10:00: CON5141 - Java DB in JDK 7: A Free, Feature-Rich, Embeddable SQL Database Rick Hillegas and Noel Poore will discuss how to use Java DB on embedded devices in their talk on Thursday, October 4 at 14:00: CON6684 - Data Storage for Embedded Middleware Mark your calendars! :)

    Read the article

  • Saucy upgrade causes menu problem in Eclipse

    - by Ralph Rassweiler
    Has anyone been through this: Using Ubuntu 13.10 (with JDK 1.7.0_45). Download and uncompress Eclipse Kepler for Java EE Developers. The software menus are messed up. I didn't notice similar problems in other software. When I click any menu of eclipse, the drop-down seems to be "cut". Sometimes the drop-down shows, but the options are invisible. I tried Eclipse Indigo, but the same problem occurs.

    Read the article

  • Source-control 'wet-work'?

    - by Phil Factor
    When a design or creative work is flawed beyond remedy, it is often best to destroy it and start again. The other day, I lost the code to a long and intricate SQL batch I was working on. I’d thought it was impossible, but it happened. With all the technology around that is designed to prevent this occurring, this sort of accident has become a rare event.  If it weren’t for a deranged laptop, and my distraction, the code wouldn’t have been lost this time.  As always, I sighed, had a soothing cup of tea, and typed it all in again.  The new code I hastily tapped in was much better: I’d held in my head the essence of how the code should work rather than the details: I now knew for certain the start point, the end, and how it should be achieved. Instantly the detritus of half-baked thoughts fell away and I was able to write logical code that performed better.  Because I could work so quickly, I was able to hold the details of all the columns and variables in my head, and the dynamics of the flow of data. It was, in fact, easier and quicker to start from scratch rather than tidy up and refactor the existing code with its inevitable fumbling and half-baked ideas. What a shame that technology is now so good that developers rarely experience the cleansing shock of losing one’s code and having to rewrite it from scratch.  If you’ve never accidentally lost your code, then it is worth doing it deliberately once for the experience. Creative people have, until Technology mistakenly prevented it, torn up their drafts or sketches, thrown them in the bin, and started again from scratch.  Leonardo’s obsessive reworking of the Mona Lisa was renowned because it was so unusual:  most artists have been utterly ruthless in destroying work that didn’t quite make it. Authors are particularly keen on writing afresh, and the results are generally positive. Lawrence of Arabia actually lost the entire 250,000-word manuscript of ‘The Seven Pillars of Wisdom’ by accidentally leaving it on a train at Reading station, before rewriting a much better version.  Now, any writer or artist is seduced by technology into altering or refining their work rather than casting it dramatically in the bin or setting light to it on a bonfire, and rewriting it from the blank page.  It is easy to pick away at a flawed work, but the real creative process is far more brutal. Once, many years ago, whilst running a software house that supplied commercial software to local businesses, I’d been supervising an accounting system for a farming cooperative. No packaged system met their needs, and it was all hand-cut code.  For us, it represented a breakthrough as it was for a government organisation, and success would guarantee more contracts. As you’ve probably guessed, the code got mangled in a disk crash just a week before the deadline for delivery, and the many backups all proved to be entirely corrupted by a faulty tape drive.  There were some fragments left on individual machines, but they were all of different versions.  The developers were in despair.  Strangely, I managed to re-write the bulk of a three-month project in a manic and caffeine-soaked weekend.  Sure, that elegant universally-applicable input-form routine wasn’t quite so elegant, but it didn’t really need to be as we knew what forms it needed to support.  Yes, the code lacked architectural elegance and reusability. By dawn on Monday, the application passed its integration tests. 
The developers rose to the occasion after I’d collapsed, and tidied up what I’d done, though they were reproachful that some of the style and elegance had gone out of the application. By the delivery date, we were able to install it. It was a smaller, faster application than the beta they’d seen and the user-interface had a new, rather Spartan, appearance that we swore was done to conform to the latest in user-interface guidelines. (we switched to Helvetica font to look more ‘Bauhaus’ ). The client was so delighted that he forgave the new bugs that had crept in. I still have the disk that crashed, up in the attic. In IT, we have had mixed experiences from complete re-writes. Lotus 123 never really recovered from a complete rewrite from assembler into C, Borland made the mistake with Arago and Quattro Pro  and Netscape’s complete rewrite of their Navigator 4 browser was a white-knuckle ride. In all cases, the decision to rewrite was a result of extreme circumstances where no other course of action seemed possible.   The rewrite didn’t come out of the blue. I prefer to remember the rewrite of Minix by young Linus Torvalds, or the rewrite of Bitkeeper by a slightly older Linus.  The rewrite of CP/M didn’t do too badly either, did it? Come to think of it, the guy who decided to rewrite the windowing system of the Xerox Star never regretted the decision. I’ll agree that one should often resist calls for a rewrite. One of the worst habits of the more inexperienced programmer is to denigrate whatever code he or she inherits, and then call loudly for a complete rewrite. They are buoyed up by the mistaken belief that they can do better. This, however, is a different psychological phenomenon, more related to the idea of some motorcyclists that they are operating on infinite lives, or the occasional squaddies that if they charge the machine-guns determinedly enough all will be well. Grim experience brings out the humility in any experienced programmer.  I’m referring to quite different circumstances here. Where a team knows the requirements perfectly, are of one mind on methodology and coding standards, and they already have a solution, then what is wrong with considering  a complete rewrite? Rewrites are so painful in the early stages, until that point where one realises the payoff, that even I quail at the thought. One needs a natural disaster to push one over the edge. The trouble is that source-control systems, and disaster recovery systems, are just too good nowadays.   If I were to lose this draft of this very blog post, I know I’d rewrite it much better. However, if you read this, you’ll know I didn’t have the nerve to delete it and start again.  There was a time that one prayed that unreliable hardware would deliver you from an unmaintainable mess of a codebase, but now technology has made us almost entirely immune to such a merciful act of God. An old friend of mine with long experience in the software industry has long had the idea of the ‘source-control wet-work’,  where one hires a malicious hacker in some wild eastern country to hack into one’s own  source control system to destroy all trace of the source to an application. Alas, backup systems are just too good to make this any more than a pipedream. Somehow, it would be difficult to promote the idea. As an alternative, could one construct a source control system that, on doing all the code-quality metrics, would systematically destroy all trace of source code that failed the quality test? 
    Alas, I can’t see many managers buying into the idea. Reading the full story of the near-loss of Toy Story 2 set me thinking. It turned out that the lucky restoration of the code wasn’t the happy ending one first imagined it to be, because they eventually came to the conclusion that the plot was fundamentally flawed and it all had to be rewritten anyway.  Was this an early case of the ‘source-control wet-job’? It is very hard nowadays to do a rapid U-turn in a development project because we are far too prone to cling to our existing source code.

    Read the article

  • ubuntu stuck in a login loop after editing profile file

    - by varunit
    I'm stuck in a login loop now. What I did was edit /etc/profile as root and add the following line: export PATH = /opt/my jdk 7 path/bin:$PATH After logging out and trying to log in again, I cannot. I have tried booting in recovery mode and entering a root prompt, then editing the file in vi, but it always opens in read-only mode and hence cannot be saved. I just need a way to delete that line and boot into Ubuntu again. Please help me out, guys.
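    A note on the read-only symptom described above: in recovery mode the root filesystem is typically mounted read-only, which is why vi cannot save. A minimal sketch of a likely fix from the recovery root prompt follows; the sed pattern is an assumption matching the line quoted above, so verify with grep before deleting:

    # Remount the root filesystem read-write so /etc/profile can be saved.
    mount -o remount,rw /

    # Confirm which line was added, then remove it.
    grep -n 'export PATH' /etc/profile
    sed -i '/export PATH = \/opt\/my jdk 7 path\/bin/d' /etc/profile

    # Reboot and log in normally.
    reboot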

    Read the article

  • Oracle Sun Java Roadshow - Budapest

    - by peter.nagy
    Another Java Roadshow event is coming, still organized by Sun Hungary, who have run these events before. Alongside the many international speakers there will be local ones as well. The topics are quite current: -Java at Oracle -Java in Embedded (SE, ME) -Java For Business -What is New in Java - JavaFX, JDK 7 So come along to listen and discuss. Logistics: May 20, 2010, 08:30 - 16:30, Ramada Plaza Budapest Hotel (1036 Árpád fejedelem útja 94.)

    Read the article

  • Configuring the iPlanet as web tier for Oracle WebCenter Content (UCM)

    - by Adao Junior
    If you are looking to configure iPlanet as the web server/proxy for Oracle WebCenter Content, you probably won’t find specific documentation for that, or will find some old, complex notes related to the old 10gR3. This post will help you out with a few simple steps. That’s the diagram of the test scenario, considering that you will deploy to production in a cluster environment. First you need the software; for our scenario you will need: - Oracle iPlanet Web Server 7.0.15+ (Installed) - Oracle WebCenter Content 11gR1 PS5 (Installed) - Oracle WebLogic Web Server Plugins 11g (1.1) - Supported JDK (Using Oracle Java JDK 7u4 for the test) - Certified Client OS - Certified Server OS (Using Oracle Solaris 11 for the test) - Certified Database (Using Oracle Database 11.2.0.3 for the test) Then the configuration: - Download the latest plugin: http://www.oracle.com/technetwork/middleware/ias/downloads/wls-plugins-096117.html - Extract the WLSPlugin11g-iPlanet7.0 in some folder, like <iPlanet_Home>/plugins/wls11 - Include the plugin reference in the magnus.conf: If Unix (Solaris or Linux), include the line: Init fn="load-modules" shlib="/apps/oracle/WebServer7/plugins/wls11/lib/mod_wl.so" If Windows, include the line:        Init fn="load-modules" shlib="D:\\oracle\\WebServer7\\plugins\\wls11\\lib\\mod_wl.dll" - Include the proxy reference in the obj.conf of each instance: <Object name="weblogic" ppath="*/cs/*"> Service fn="wl-proxy" WebLogicCluster="wcc-node1:16201,wcc-node2:16202, wcc-node3:16203" </Object>   <Object name="weblogic" ppath="*/_dav/*"> Service fn="wl-proxy" WebLogicCluster="wcc-node1:16201,wcc-node2:16202, wcc-node3:16203" </Object>   <Object name="weblogic" ppath="*/_ocsh/*"> Service fn="wl-proxy" WebLogicCluster="wcc-node1:16201,wcc-node2:16202, wcc-node3:16203" </Object>   <Object name="weblogic" ppath="*/adfAuthentication/*"> Service fn="wl-proxy" WebLogicCluster="wcc-node1:16201,wcc-node2:16202, wcc-node3:16203" </Object> If you are using a single-node setup, change the Service fn=…. line to something like: Service fn="wl-proxy" WebLogicHost=<wcc-server> WebLogicPort=16200 With these configurations, you should have the WebCenter Content UI working with iPlanet; test it. [http://<web-server>/cs/] With the UI working, the last step is to configure WebDAV: - Go to the iPlanet Admin Console (usually https://<web-server>:8989) - Go to Configurations >> [instance] >> Virtual Servers >> [Virtual Server] >> WebDAV: - Click New - Populate the URI with /cs/idcplg/webdav: - Select “Anyone (No Authentication)”; WebCenter Content will take care of the security: This will allow you to use the WebDAV feature and the Desktop Integration Suite, including double-byte characters. Other iPlanet tuning could be done; I can cover that in the next post related to iPlanet. Cross-posted on the ContentrA.com Blog Related posts:  - Using a Web Proxy Server with WebCenter Family

    Read the article

  • Java and Web conferences on Friday, July 9 at Sophia Antipolis, co-organized by the RivieraJUG and

    Good evening. As part of the ten days of SophiaConf, the RivieraJUG, together with the Open Coffee Club Sophia and the Bar Camp Sophia, are putting together a full day of talks on Java and Web topics, rounded off by a BarCamp. There will be no fewer than 12 presentations, across two parallel tracks running from 9:00 to 17:30: The Java track: JDK 7, by Simon Ritter; Play! Framework, by Nicolas Leroux; Hibernate Search: finding your data, you deserve better ...

    Read the article

< Previous Page | 79 80 81 82 83 84 85 86 87 88 89 90  | Next Page >