Search Results

Search found 14828 results on 594 pages for 'settings py'.


  • SmtpClient and Locked File Attachments

    - by Rick Strahl
    Got a note a couple of days ago from a client using one of my generic routines that wraps SmtpClient. Apparently whenever a file has been attached to a message and emailed with SmtpClient, the file remains locked after the message has been sent. Oddly this particular issue hasn't cropped up before for me, although these routines are in use in a number of applications I've built. The wrapper I use was built mainly to backfit an old pre-.NET 2.0 email client I built using Sockets to avoid the CDO nightmares of the .NET 1.x mail client. The current class retained the same class interface but now internally uses SmtpClient, which exposes a flat property interface that makes it less verbose to send off email messages. File attachments in this interface are handled by providing a comma-delimited list of files in an Attachments string property, which is then collected along with the other flat property settings and eventually passed on to SmtpClient in the form of a MailMessage object. The gist of the code is something like this:

        /// <summary>
        /// Fully self contained mail sending method. Sends an email message by connecting
        /// and disconnecting from the email server.
        /// </summary>
        /// <returns>true or false</returns>
        public bool SendMail()
        {
            if (!this.Connect())
                return false;

            try
            {
                // Create and configure the message
                MailMessage msg = this.GetMessage();
                smtp.Send(msg);
                this.OnSendComplete(this);
            }
            catch (Exception ex)
            {
                string msg = ex.Message;
                if (ex.InnerException != null)
                    msg = ex.InnerException.Message;
                this.SetError(msg);
                this.OnSendError(this);
                return false;
            }
            finally
            {
                // close connection and clear out headers
                // SmtpClient instance nulled out
                this.Close();
            }

            return true;
        }

        /// <summary>
        /// Configures the message interface
        /// </summary>
        protected virtual MailMessage GetMessage()
        {
            MailMessage msg = new MailMessage();

            msg.Body = this.Message;
            msg.Subject = this.Subject;
            msg.From = new MailAddress(this.SenderEmail, this.SenderName);

            if (!string.IsNullOrEmpty(this.ReplyTo))
                msg.ReplyTo = new MailAddress(this.ReplyTo);

            // Send all the different recipients
            this.AssignMailAddresses(msg.To, this.Recipient);
            this.AssignMailAddresses(msg.CC, this.CC);
            this.AssignMailAddresses(msg.Bcc, this.BCC);

            if (!string.IsNullOrEmpty(this.Attachments))
            {
                string[] files = this.Attachments.Split(new char[2] { ',', ';' },
                                                        StringSplitOptions.RemoveEmptyEntries);
                foreach (string file in files)
                {
                    msg.Attachments.Add(new Attachment(file));
                }
            }

            if (this.ContentType.StartsWith("text/html"))
                msg.IsBodyHtml = true;
            else
                msg.IsBodyHtml = false;

            msg.BodyEncoding = this.Encoding;

            // … additional code omitted

            return msg;
        }

    Basically this code collects all the property settings of the wrapper object and, in GetMessage(), applies them to the properties of an individual MailMessage. Specifically notice that attachment filenames are converted from a comma-delimited string to filenames from which new attachments are created. The code as written, however, will cause the problem of file attachments not being released properly. Internally .NET opens up stream handles and reads the files from disk to dump them into the email send stream. The attachments are always sent correctly, but the local files are not immediately closed.
    As you probably guessed, the issue is simply that some resources are not automatically disposed when sending is complete, and sure enough the following code change fixes the problem:

        // Create and configure the message
        using (MailMessage msg = this.GetMessage())
        {
            smtp.Send(msg);
            if (this.SendComplete != null)
                this.OnSendComplete(this);
            // or use an explicit msg.Dispose() here
        }

    The message object requires an explicit call to Dispose() (or a using() block as I have here) to force the attachment files to get closed. I think this is rather odd behavior for this scenario, however. The code I use passes in filenames, and my expectation of an API that accepts file names is that it uses the files by opening and streaming them and then closing them when done. Why keep the streams open and require an explicit .Dispose() by the calling code, which is bound to lead to unexpected behavior just as my customer ran into? Any API-level code should clean up as much as possible, and this is clearly not happening here, resulting in unexpected behavior. Apparently lots of other folks have run into this before, as I found based on a few Twitter comments on this topic. Odd to me too is that SmtpClient doesn't implement IDisposable – it's only the MailMessage (and Attachments) that implement it and require it to clean up leftover resources like open file handles. This means that you couldn't even use a using() statement around the SmtpClient code to resolve this – instead you'd have to wrap it around the message object, which again is rather unexpected. Well, chalk that one up to another small unexpected behavior that wasted half an hour of my time – hopefully this post will help someone avoid the same half hour of hunting and searching. (A complete minimal example of the corrected pattern follows after the resources below.)

    Resources:
    - Full code to SmptClientNative (West Wind Web Toolkit Repository)
    - SmtpClient Documentation (MSDN)

    © Rick Strahl, West Wind Technologies, 2005-2010. Posted in .NET
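    For reference, here is a minimal, self-contained sketch of the corrected pattern. The class, method and address names are illustrative only, not taken from the West Wind wrapper:

        using System.Net.Mail;

        public static class MailSenderSample
        {
            // Minimal sketch: send a message with a file attachment and make sure
            // the attachment's file handle is released afterwards. The host and
            // address values are placeholders.
            public static void SendWithAttachment(string attachmentFile)
            {
                SmtpClient smtp = new SmtpClient("localhost");

                // MailMessage and its Attachments implement IDisposable; the using
                // block guarantees Dispose() runs, which closes the file stream.
                using (MailMessage msg = new MailMessage("me@example.com", "you@example.com",
                                                         "Report", "See attached."))
                {
                    msg.Attachments.Add(new Attachment(attachmentFile));
                    smtp.Send(msg);
                } // the attached file is unlocked here
            }
        }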


  • HostGator hosting account and DNS servers

    - by fxuser
    I have a hosting package on HostGator for the domain example.com. I also got a VPS from Digital Ocean and have set up the DNS details on the server there (an A record pointing at the DO server IP, with @ as hostname), and I have also set up the DNS servers in my domain's DNS settings, which are hosted on HostGator. All seems okay so far... It's been 3-4 hours, I think, since I made these changes, and when I browse to example.com I still get the files of the active hosting package instead of the files on the server. Do I need to cancel the hosting package first before I can make this work? EDIT: If this is the wrong site for the question, please move it to the correct one.
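    One way to watch a change like this propagate (a sketch assuming standard DNS tooling; swap in your own domain and nameservers) is to compare what the authoritative server and a public resolver each return:

        # What the authoritative nameserver hands out (DigitalOcean's, if delegated there)
        dig +short example.com @ns1.digitalocean.com
        # What the rest of the world currently sees via a public resolver
        dig +short example.com @8.8.8.8
        # The TTL column in a full answer shows how long cached copies may linger
        dig example.com

    If the two answers differ, cached records simply haven't expired yet.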


  • Setting up a local AI server - easy with Solaris 11

    - by Stefan Hinker
    Many things are new in Solaris 11, Autoinstall is one of them.  If, like me, you've known Jumpstart for the last 2 centuries or so, you'll have to start from scratch.  Well, almost, as the concepts are similar, and it's not all that difficult.  Just new.

    I wanted to have an AI server that I could use for demo purposes, on the train if need be.  That answers the question of hardware requirements: portable.  But let's start at the beginning.

    First, you need an OS image, of course.  In the new world of Solaris 11, it is now called a repository.  The original can be downloaded from the Solaris 11 page at Oracle.  What you want is the "Oracle Solaris 11 11/11 Repository Image", which comes in two parts that can be combined using cat.  MD5 checksums for these (and all other downloads from that page) are available closer to the top of the page.  With that, building the repository is quick and simple:

        # zfs create -o mountpoint=/export/repo rpool/ai/repo
        # zfs create rpool/ai/repo/sol11
        # mount -o ro -F hsfs /tmp/sol-11-1111-repo-full.iso /mnt
        # rsync -aP /mnt/repo /export/repo/sol11
        # umount /mnt
        # pkgrepo rebuild -s /export/repo/sol11/repo
        # zfs snapshot rpool/ai/repo/sol11@fcs
        # pkgrepo info -s /export/repo/sol11/repo
        PUBLISHER PACKAGES STATUS UPDATED
        solaris   4292     online 2012-03-12T20:47:15.378639Z

    That's all there's to it.  Let's make a snapshot, just to be on the safe side.  You never know when one will come in handy.  To use this repository, you could just add it as a file-based publisher:

        # pkg set-publisher -g file:///export/repo/sol11/repo solaris

    In case I'd want to access this repository through a (virtual) network, I'll now quickly activate the repository service:

        # svccfg -s application/pkg/server setprop pkg/inst_root=/export/repo/sol11/repo
        # svccfg -s application/pkg/server setprop pkg/readonly=true
        # svcadm refresh application/pkg/server
        # svcadm enable application/pkg/server

    That's all you need - now point your browser to http://localhost/ to view your beautiful repository server.  Step 1 is done.  All of this, by the way, is nicely documented in the README file that's contained in the repository image.

    Of course, we already have updates to the original release.  You can find them in MOS in the Oracle Solaris 11 Support Repository Updates (SRU) Index.  You can simply add these to your existing repository or create separate repositories for each SRU.  The individual SRUs are self-sufficient and incremental - SRU4 includes all updates from SRU2 and SRU3.  With ZFS, you can also get both: a full repository with all updates and at the same time incremental ones up to each of the updates:

        # mount -o ro -F hsfs /tmp/sol-11-1111-sru4-05-incr-repo.iso /mnt
        # pkgrecv -s /mnt/repo -d /export/repo/sol11/repo '*'
        # umount /mnt
        # pkgrepo rebuild -s /export/repo/sol11/repo
        # zfs snapshot rpool/ai/repo/sol11@sru4
        # zfs set snapdir=visible rpool/ai/repo/sol11
        # svcadm restart svc:/application/pkg/server:default

    The normal repository is now updated to SRU4.  Thanks to the ZFS snapshots, there is also a valid repository of Solaris 11 11/11 without the update located at /export/repo/sol11/.zfs/snapshot/fcs.  If you like, you can also create another repository service for each update, running on a separate port.

    But now let's continue with the AI server.  Just a little bit of reading in the documentation makes it clear that we will need to run a DHCP server for this.
    Since I already have one active (for my SunRay installation) and since it's a good idea to have these kinds of services separate anyway, I decided to create this in a Zone.  So, let's create one first:

        # zfs create -o mountpoint=/export/install rpool/ai/install
        # zfs create -o mountpoint=/zones rpool/zones
        # zonecfg -z ai-server
        zonecfg:ai-server> create
        create: Using system default template 'SYSdefault'
        zonecfg:ai-server> set zonepath=/zones/ai-server
        zonecfg:ai-server> add dataset
        zonecfg:ai-server:dataset> set name=rpool/ai/install
        zonecfg:ai-server:dataset> set alias=install
        zonecfg:ai-server:dataset> end
        zonecfg:ai-server> commit
        zonecfg:ai-server> exit
        # zoneadm -z ai-server install
        # zoneadm -z ai-server boot ; zlogin -C ai-server

    Give it a hostname and IP address at first boot, and there's the Zone.  As a publisher for Solaris packages, it will be bound to the "System Publisher" from the Global Zone.  The /export/install filesystem, of course, is intended to be used by the AI server.  Let's configure it now:

        # zlogin ai-server
        root@ai-server:~# pkg install install/installadm
        root@ai-server:~# installadm create-service -n x86-fcs -a i386 \
           -s pkg://solaris/install-image/solaris-auto-install@5.11,5.11-0.175.0.0.0.2.1482 \
           -d /export/install/fcs -i 192.168.2.20 -c 3

    With that, the core AI server is already done.  What happened here?  First, I installed the AI server software.  IPS makes that nice and easy.  If necessary, it'll also pull in the required DHCP server and anything else that might be missing.  Watch out for that DHCP server software.  In Solaris 11, there are two different versions: the one you might know from Solaris 10 and earlier, and a new one from ISC.  The latter is the one we need for AI.  The SMF service names of both are very similar.  The "old" one is "svc:/network/dhcp-server:default".  The ISC server comes with several SMF services; we at least need "svc:/network/dhcp/server:ipv4".

    The command "installadm create-service" creates the installation service.  It's called "x86-fcs", serves the "i386" architecture and gets its boot image from the repository of the system publisher, using version 5.11,5.11-0.175.0.0.0.2.1482, which is Solaris 11 11/11.  (The option "-a i386" in this example is optional, since the install server itself runs on an x86 machine.)  The boot environment for clients is created in /export/install/fcs and the DHCP server is configured for 3 IP addresses starting at 192.168.2.20.  This configuration is stored in a very human-readable form in /etc/inet/dhcpd4.conf.  An AI service for SPARC systems could be created in the very same way, using "-a sparc" as the architecture option.

    Now we would be ready to register and install the first client.  It would be installed with the default "solaris-large-server" using the publisher "http://pkg.oracle.com/solaris/release" and would query its configuration interactively at first boot.  This makes it very clear that an AI server is really only a boot server.  The true source of packages to install can be different.  Since I don't like these defaults for my demo setup, I did some extra config work for my clients.

    The configuration of a client is controlled by manifests and profiles.  The manifest controls which packages are installed and how the filesystems are laid out.  In that, it's very much like the old "rules.ok" file in Jumpstart.  Profiles contain additional configuration like root passwords, primary user account, IP addresses, keyboard layout etc.
    Hence, profiles are very similar to the old sysidcfg file.

    The easiest way to get your hands on a manifest is to ask the AI server we just created to give us its default one.  Then modify that to our liking and give it back to the install server to use:

        root@ai-server:~# mkdir -p /export/install/configs/manifests
        root@ai-server:~# cd /export/install/configs/manifests
        root@ai-server:~# installadm export -n x86-fcs -m orig_default \
           -o orig_default.xml
        root@ai-server:~# cp orig_default.xml s11-fcs.small.local.xml
        root@ai-server:~# vi s11-fcs.small.local.xml
        root@ai-server:~# more s11-fcs.small.local.xml
        <!DOCTYPE auto_install SYSTEM "file:///usr/share/install/ai.dtd.1">
        <auto_install>
          <ai_instance name="S11 Small fcs local">
            <target>
              <logical>
                <zpool name="rpool" is_root="true">
                  <filesystem name="export" mountpoint="/export"/>
                  <filesystem name="export/home"/>
                  <be name="solaris"/>
                </zpool>
              </logical>
            </target>
            <software type="IPS">
              <destination>
                <image>
                  <!-- Specify locales to install -->
                  <facet set="false">facet.locale.*</facet>
                  <facet set="true">facet.locale.de</facet>
                  <facet set="true">facet.locale.de_DE</facet>
                  <facet set="true">facet.locale.en</facet>
                  <facet set="true">facet.locale.en_US</facet>
                </image>
              </destination>
              <source>
                <publisher name="solaris">
                  <origin name="http://192.168.2.12/"/>
                </publisher>
              </source>
              <!-- By default the latest build available, in the specified IPS
                   repository, is installed. If another build is required, the
                   build number has to be appended to the 'entire' package in
                   the following form: <name>pkg:/entire@0.5.11-0.build#</name> -->
              <software_data action="install">
                <name>pkg:/entire@0.5.11,5.11-0.175.0.0.0.2.0</name>
                <name>pkg:/group/system/solaris-small-server</name>
              </software_data>
            </software>
          </ai_instance>
        </auto_install>
        root@ai-server:~# installadm create-manifest -n x86-fcs -d \
           -f ./s11-fcs.small.local.xml
        root@ai-server:~# installadm list -m -n x86-fcs
        Manifest            Status   Criteria
        --------            ------   --------
        S11 Small fcs local Default  None
        orig_default        Inactive None

    The major points in this new manifest are:
    - Install "solaris-small-server"
    - Install a few locales less than the default.  I'm not that fluent in French or Japanese...
    - Use my own package service as publisher, running on IP address 192.168.2.12
    - Install the initial release of Solaris 11: pkg:/entire@0.5.11,5.11-0.175.0.0.0.2.0

    Using a similar approach, I'll create a default profile interactively and use it as a template for a few customized building blocks, each defining a part of the overall system configuration.
    The modular approach makes it easy to configure numerous clients later on:

        root@ai-server:~# mkdir -p /export/install/configs/profiles
        root@ai-server:~# cd /export/install/configs/profiles
        root@ai-server:~# sysconfig create-profile -o default.xml
        root@ai-server:~# cp default.xml general.xml; cp default.xml mars.xml
        root@ai-server:~# cp default.xml user.xml
        root@ai-server:~# vi general.xml mars.xml user.xml
        root@ai-server:~# more general.xml mars.xml user.xml
        ::::::::::::::
        general.xml
        ::::::::::::::
        <!DOCTYPE service_bundle SYSTEM "/usr/share/lib/xml/dtd/service_bundle.dtd.1">
        <service_bundle type="profile" name="sysconfig">
          <service version="1" type="service" name="system/timezone">
            <instance enabled="true" name="default">
              <property_group type="application" name="timezone">
                <propval type="astring" name="localtime" value="Europe/Berlin"/>
              </property_group>
            </instance>
          </service>
          <service version="1" type="service" name="system/environment">
            <instance enabled="true" name="init">
              <property_group type="application" name="environment">
                <propval type="astring" name="LANG" value="C"/>
              </property_group>
            </instance>
          </service>
          <service version="1" type="service" name="system/keymap">
            <instance enabled="true" name="default">
              <property_group type="system" name="keymap">
                <propval type="astring" name="layout" value="US-English"/>
              </property_group>
            </instance>
          </service>
          <service version="1" type="service" name="system/console-login">
            <instance enabled="true" name="default">
              <property_group type="application" name="ttymon">
                <propval type="astring" name="terminal_type" value="vt100"/>
              </property_group>
            </instance>
          </service>
          <service version="1" type="service" name="network/physical">
            <instance enabled="true" name="default">
              <property_group type="application" name="netcfg">
                <propval type="astring" name="active_ncp" value="DefaultFixed"/>
              </property_group>
            </instance>
          </service>
          <service version="1" type="service" name="system/name-service/switch">
            <property_group type="application" name="config">
              <propval type="astring" name="default" value="files"/>
              <propval type="astring" name="host" value="files dns"/>
              <propval type="astring" name="printer" value="user files"/>
            </property_group>
            <instance enabled="true" name="default"/>
          </service>
          <service version="1" type="service" name="system/name-service/cache">
            <instance enabled="true" name="default"/>
          </service>
          <service version="1" type="service" name="network/dns/client">
            <property_group type="application" name="config">
              <property type="net_address" name="nameserver">
                <net_address_list>
                  <value_node value="192.168.2.1"/>
                </net_address_list>
              </property>
            </property_group>
            <instance enabled="true" name="default"/>
          </service>
        </service_bundle>
        ::::::::::::::
        mars.xml
        ::::::::::::::
        <!DOCTYPE service_bundle SYSTEM "/usr/share/lib/xml/dtd/service_bundle.dtd.1">
        <service_bundle type="profile" name="sysconfig">
          <service version="1" type="service" name="network/install">
            <instance enabled="true" name="default">
              <property_group type="application" name="install_ipv4_interface">
                <propval type="astring" name="address_type" value="static"/>
                <propval type="net_address_v4" name="static_address" value="192.168.2.100/24"/>
                <propval type="astring" name="name" value="net0/v4"/>
                <propval type="net_address_v4" name="default_route" value="192.168.2.1"/>
              </property_group>
              <property_group type="application" name="install_ipv6_interface">
                <propval type="astring" name="stateful" value="yes"/>
                <propval type="astring" name="stateless" value="yes"/>
                <propval type="astring" name="address_type" value="addrconf"/>
                <propval type="astring" name="name" value="net0/v6"/>
              </property_group>
            </instance>
          </service>
          <service version="1" type="service" name="system/identity">
            <instance enabled="true" name="node">
              <property_group type="application" name="config">
                <propval type="astring" name="nodename" value="mars"/>
              </property_group>
            </instance>
          </service>
        </service_bundle>
        ::::::::::::::
        user.xml
        ::::::::::::::
        <!DOCTYPE service_bundle SYSTEM "/usr/share/lib/xml/dtd/service_bundle.dtd.1">
        <service_bundle type="profile" name="sysconfig">
          <service version="1" type="service" name="system/config-user">
            <instance enabled="true" name="default">
              <property_group type="application" name="root_account">
                <propval type="astring" name="login" value="root"/>
                <propval type="astring" name="password" value="noIWillNotTellYouMyPasswordNotEvenEncrypted"/>
                <propval type="astring" name="type" value="role"/>
              </property_group>
              <property_group type="application" name="user_account">
                <propval type="astring" name="login" value="stefan"/>
                <propval type="astring" name="password" value="noIWillNotTellYouMyPasswordNotEvenEncrypted"/>
                <propval type="astring" name="type" value="normal"/>
                <propval type="astring" name="description" value="Stefan Hinker"/>
                <propval type="count" name="uid" value="12345"/>
                <propval type="count" name="gid" value="10"/>
                <propval type="astring" name="shell" value="/usr/bin/bash"/>
                <propval type="astring" name="roles" value="root"/>
                <propval type="astring" name="profiles" value="System Administrator"/>
                <propval type="astring" name="sudoers" value="ALL=(ALL) ALL"/>
              </property_group>
            </instance>
          </service>
        </service_bundle>
        root@ai-server:~# installadm create-profile -n x86-fcs -f general.xml
        root@ai-server:~# installadm create-profile -n x86-fcs -f user.xml
        root@ai-server:~# installadm create-profile -n x86-fcs -f mars.xml \
           -c ipv4=192.168.2.100
        root@ai-server:~# installadm list -p
        Service Name Profile
        ------------ -------
        x86-fcs      general.xml
                     mars.xml
                     user.xml
        root@ai-server:~# installadm list -n x86-fcs -p
        Profile     Criteria
        -------     --------
        general.xml None
        mars.xml    ipv4 = 192.168.2.100
        user.xml    None

    Here's the idea behind these files:
    - "general.xml" contains settings valid for all my clients.  Stuff like DNS servers, for example, which in my case will always be the same.
    - "user.xml" only contains user definitions.  That is, a root password and a primary user.  Both of these profiles will be valid for all clients (for now).
    - "mars.xml" defines network settings for an individual client.  This profile is associated with an IP address.  For this to work, I'll have to tweak the DHCP settings in the next step:

        root@ai-server:~# installadm create-client -e 08:00:27:AA:3D:B1 -n x86-fcs
        root@ai-server:~# vi /etc/inet/dhcpd4.conf
        root@ai-server:~# tail -5 /etc/inet/dhcpd4.conf
        host 080027AA3DB1 {
          hardware ethernet 08:00:27:AA:3D:B1;
          fixed-address 192.168.2.100;
          filename "01080027AA3DB1";
        }

    This completes the client preparations.  I manually added the IP address for mars to /etc/inet/dhcpd4.conf.  This is needed for the "mars.xml" profile.  Disabling arbitrary DHCP replies will shut up this DHCP server, making my life in a shared environment a lot more peaceful ;-)

    Now, I of course want this installation to be completely hands-off.  For this to work, I'll need to modify the grub boot menu for this client slightly.  You can find it in /etc/netboot.  "installadm create-client" will create a new boot menu for every client, identified by the client's MAC address.
    The template for this can be found in a subdirectory with the name of the install service, /etc/netboot/x86-fcs in our case.  If you don't want to change this manually for every client, modify that template to your liking instead.

        root@ai-server:~# cd /etc/netboot
        root@ai-server:~# cp menu.lst.01080027AA3DB1 menu.lst.01080027AA3DB1.org
        root@ai-server:~# vi menu.lst.01080027AA3DB1
        root@ai-server:~# diff menu.lst.01080027AA3DB1 menu.lst.01080027AA3DB1.org
        1,2c1,2
        < default=1
        < timeout=10
        ---
        > default=0
        > timeout=30
        root@ai-server:~# more menu.lst.01080027AA3DB1
        default=1
        timeout=10
        min_mem64=0
        title Oracle Solaris 11 11/11 Text Installer and command line
            kernel$ /x86-fcs/platform/i86pc/kernel/$ISADIR/unix -B install_media=http://$serverIP:5555//export/install/fcs,install_service=x86-fcs,install_svc_address=$serverIP:5555
            module$ /x86-fcs/platform/i86pc/$ISADIR/boot_archive
        title Oracle Solaris 11 11/11 Automated Install
            kernel$ /x86-fcs/platform/i86pc/kernel/$ISADIR/unix -B install=true,install_media=http://$serverIP:5555//export/install/fcs,install_service=x86-fcs,install_svc_address=$serverIP:5555,livemode=text
            module$ /x86-fcs/platform/i86pc/$ISADIR/boot_archive

    Now just boot the client off the network using PXE boot.  For my demo purposes, that's a client from VirtualBox, of course.  That's all there's to it.  And despite the fact that this blog entry is a little longer - that wasn't that hard now, was it?


  • No sound after video card replaced (AMD Radeon HD 7770)

    - by Sean
    Issue: no sound.

    System: dual-boot Windows 7 (sda) / Ubuntu 12.04 (sdb), 2 hard drives, Dell XPS 730
    Video card: AMD Radeon HD 7770 (Diamond Multimedia)
    Sound card: Creative Labs SB X-Fi

    Additional info: My sound used to work. Then my old video card (NVIDIA GeForce 280) died. I bought and installed a new video card: Radeon HD 7770. After this, my sound no longer worked in Ubuntu (Win7 audio still works). Everything else in Ubuntu, such as video, works fine. I suspect it has something to do with the fact that the Radeon card includes sound capability.

    Problem details:

    1. If I click on System Settings - Sound, the panel freezes and stops responding indefinitely.
    2. The sound volume icon at the top of the screen (by the clock) shows 3 dashes beside it "---", and an empty drop-down box shows if I click on it.
    3. (Possibly related to 1.) When I reboot my machine, I get the message: "gnome settings daemon not responding". I have to force the reboot.

    I reinstalled Ubuntu (preserving my home directory) and the problem persists.

    Diagnostics info, following the procedure outlined here: https://help.ubuntu.com/community/SoundTroubleshooting. The following is a list of terminal commands and their output:

        $ aplay -l
        List of PLAYBACK Hardware Devices

    There is no listing beyond that, and the command freezes until I hit Control-C.

        $ lspci -v | grep -A7 -i "audio"
        00:0f.1 Audio device: NVIDIA Corporation MCP55 High Definition Audio (rev a2)
            Subsystem: Dell Device 0224
            Flags: bus master, 66MHz, fast devsel, latency 0, IRQ 23
            Memory at dfff0000 (32-bit, non-prefetchable) [size=16K]
            Capabilities: <access denied>
            Kernel driver in use: snd_hda_intel
            Kernel modules: snd-hda-intel
        --
        01:00.1 Audio device: Advanced Micro Devices [AMD] nee ATI Device aab0
            Subsystem: Diamond Multimedia Systems Device aab0
            Flags: bus master, fast devsel, latency 0, IRQ 43
            Memory at dfefc000 (64-bit, non-prefetchable) [size=16K]
            Capabilities: <access denied>
            Kernel driver in use: snd_hda_intel
            Kernel modules: snd-hda-intel
        --
        03:0a.0 Audio device: Creative Labs SB X-Fi
            Subsystem: Creative Labs Device 6002
            Flags: bus master, medium devsel, latency 32, IRQ 18
            Memory at dbff4000 (32-bit, non-prefetchable) [size=16K]
            Memory at dbc00000 (64-bit, non-prefetchable) [size=2M]
            Memory at d4000000 (64-bit, non-prefetchable) [size=64M]
            I/O ports at 8c00 [size=32]
            Capabilities: <access denied>

    Notice the Diamond Multimedia Systems device - that seems to be my video card's sound; my video card is Diamond Multimedia. Also there's the weird NVIDIA device in there. That must either be a remnant of my now-removed NVIDIA graphics card, or else some kind of on-board thing. Not sure which.

        $ killall pulseaudio

    This allows me to open System Settings - Sound, but the "Test Sound" button makes no sound, and the output volume and mute controls are greyed out / disabled at 0 volume. It also allows me to click on the sound control in the "task bar" (beside the clock), and a volume slider drops down, but it is disabled / greyed out at 0 volume.
        $ find /lib/modules/`uname -r` | grep snd
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-88pm860x.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-tlv320aic3x.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8900.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8978.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-tlv320dac33.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm9090.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-sta32x.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-max98088.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-max9850.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-rt5631.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8903.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8580.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8523.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-max9877.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-ads117x.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8955.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8804.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-sgtl5000.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8750.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm2000.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-tlv320aic32x4.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-ak4642.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-ad193x.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8753.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-ak4535.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8985.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8350.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-dfbmcs320.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-cs42l51.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-tlv320aic26.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8737.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-uda1380.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8776.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8995.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-tpa6130a2.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8727.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm5100.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8991.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8510.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-jz4740-codec.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8400.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-lm4857.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8960.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-alc5623.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-cs4270.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-tlv320aic23.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8993.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8961.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8940.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-uda134x.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-ad1836.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8994.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8782.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-cs4271.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8974.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8983.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8962.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-ak4641.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm-hubs.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8971.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8996.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wl1273.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-adav80x.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-spdif.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-pcm3008.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-cx20442.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-ak4671.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8711.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-ad73311.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-max98095.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm9081.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8741.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm1250-ev1.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8988.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-adau1373.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8731.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-l3.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-ssm2602.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-da7210.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-ak4104.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8904.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8728.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8770.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/codecs/snd-soc-wm8990.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/soc/snd-soc-core.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/synth/emux/snd-emux-synth.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/synth/snd-util-mem.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/snd-hrtimer.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/snd-hwdep.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/snd-pcm.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/snd-rawmidi.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/oss/snd-mixer-oss.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/snd-page-alloc.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/seq/snd-seq-midi.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/seq/snd-seq-dummy.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/seq/snd-seq-virmidi.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/seq/snd-seq-device.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/seq/snd-seq-midi-event.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/seq/snd-seq.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/seq/snd-seq-midi-emul.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/snd.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/core/snd-timer.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pcmcia/pdaudiocf/snd-pdaudiocf.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pcmcia/vx/snd-vxpocket.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/usb/6fire/snd-usb-6fire.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/usb/snd-usbmidi-lib.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/usb/caiaq/snd-usb-caiaq.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/usb/usx2y/snd-usb-usx2y.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/usb/usx2y/snd-usb-us122l.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/usb/snd-usb-audio.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/usb/misc/snd-ua101.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/opl3/snd-opl3-synth.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/opl3/snd-opl3-lib.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/opl4/snd-opl4-lib.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/opl4/snd-opl4-synth.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/snd-portman2x4.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/snd-serial-u16550.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/snd-mts64.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/snd-mtpav.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/mpu401/snd-mpu401.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/mpu401/snd-mpu401-uart.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/vx/snd-vx-lib.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/snd-dummy.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/snd-aloop.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/pcsp/snd-pcsp.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/drivers/snd-virmidi.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/firewire/snd-firewire-lib.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/firewire/snd-firewire-speakers.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/firewire/snd-isight.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/i2c/snd-tea6330t.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/i2c/other/snd-tea575x-tuner.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/i2c/other/snd-ak4113.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/i2c/other/snd-pt2258.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/i2c/other/snd-ak4117.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/i2c/other/snd-ak4xxx-adda.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/i2c/other/snd-ak4114.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/i2c/snd-cs8427.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/i2c/snd-i2c.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/emu10k1/snd-emu10k1-synth.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/emu10k1/snd-emu10k1.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/emu10k1/snd-emu10k1x.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/korg1212/snd-korg1212.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/au88x0/snd-au8830.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/au88x0/snd-au8820.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/au88x0/snd-au8810.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/aw2/snd-aw2.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-sis7019.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-ens1371.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/vx222/snd-vx222.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-via82xx.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-es1968.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-atiixp-modem.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-cs4281.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-sonicvibes.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-intel8x0.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-maestro3.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/ac97/snd-ac97-codec.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-es1938.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-fm801.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/nm256/snd-nm256.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/hda/snd-hda-codec-realtek.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/hda/snd-hda-codec-cmedia.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/hda/snd-hda-codec-conexant.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/hda/snd-hda-intel.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/hda/snd-hda-codec-analog.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/hda/snd-hda-codec-hdmi.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/hda/snd-hda-codec-idt.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/hda/snd-hda-codec-ca0110.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/hda/snd-hda-codec-cirrus.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/hda/snd-hda-codec-via.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/hda/snd-hda-codec-ca0132.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/hda/snd-hda-codec-si3054.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/hda/snd-hda-codec.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/riptide/snd-riptide.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-ens1370.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-als4000.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-intel8x0m.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/ca0106/snd-ca0106.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-cs5530.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/cs5535audio/snd-cs5535audio.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-rme32.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/ymfpci/snd-ymfpci.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/ctxfi/snd-ctxfi.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-azt3328.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/cs46xx/snd-cs46xx.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/lx6464es/snd-lx6464es.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/ice1712/snd-ice1712.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/ice1712/snd-ice17xx-ak4xxx.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/ice1712/snd-ice1724.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/mixart/snd-mixart.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/ali5451/snd-ali5451.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/lola/snd-lola.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/oxygen/snd-oxygen-lib.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/oxygen/snd-oxygen.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/oxygen/snd-virtuoso.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-via82xx-modem.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/pcxhr/snd-pcxhr.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/echoaudio/snd-indigo.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/echoaudio/snd-echo3g.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/echoaudio/snd-mona.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/echoaudio/snd-layla20.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/echoaudio/snd-gina20.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/echoaudio/snd-layla24.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/echoaudio/snd-mia.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/echoaudio/snd-indigoiox.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/echoaudio/snd-darla24.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/echoaudio/snd-indigoio.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/echoaudio/snd-indigodjx.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/echoaudio/snd-gina24.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/echoaudio/snd-darla20.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/echoaudio/snd-indigodj.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-cmipci.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/asihpi/snd-asihpi.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-ad1889.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/rme9652/snd-rme9652.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/rme9652/snd-hdspm.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/rme9652/snd-hdsp.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/trident/snd-trident.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-atiixp.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-als300.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-bt87x.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/pci/snd-rme96.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/opti9xx/snd-miro.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/opti9xx/snd-opti92x-ad1848.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/opti9xx/snd-opti93x.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/opti9xx/snd-opti92x-cs4231.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/gus/snd-gusextreme.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/gus/snd-interwave.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/gus/snd-gusmax.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/gus/snd-interwave-stb.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/gus/snd-gus-lib.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/gus/snd-gusclassic.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/sb/snd-emu8000-synth.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/sb/snd-sb16-dsp.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/sb/snd-sbawe.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/sb/snd-sb8-dsp.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/sb/snd-sb-common.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/sb/snd-sb16.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/sb/snd-sb16-csp.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/sb/snd-sb8.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/sb/snd-jazz16.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/snd-es18xx.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/snd-azt2320.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/snd-cmi8330.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/snd-als100.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/msnd
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/msnd/snd-msnd-classic.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/msnd/snd-msnd-pinnacle.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/msnd/snd-msnd-lib.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/cs423x/snd-cs4231.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/cs423x/snd-cs4236.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/es1688/snd-es1688-lib.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/es1688/snd-es1688.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/snd-adlib.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/ad1848/snd-ad1848.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/ad1816a/snd-ad1816a.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/galaxy/snd-azt1605.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/galaxy/snd-azt2316.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/wavefront/snd-wavefront.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/wss/snd-wss-lib.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/snd-sc6000.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/snd-sscape.ko
        /lib/modules/3.2.0-29-generic-pae/kernel/sound/isa/snd-opl3sa2.ko


  • Lessons from a SAN Failure

    - by Bill Graziano
    At 1:10AM Sunday morning the main SAN at one of my clients suffered a "partial" failure.  Partial means that the SAN was still online and functioning, but the LUNs attached to our two main SQL Servers "failed".  Failed means that SQL Server wouldn't start and the MDF and LDF files mostly showed a zero file size.  But they were online and responding, and most other LUNs were available.  I'm not sure how SANs know to fail at 1AM on a Saturday night, but they seem to.  From a personal standpoint this worked out poorly: I was out with friends and after more than a few drinks.  From a work standpoint this was about the best time to fail you could imagine.  Everything was running well before Monday morning.  But it was a long, long Sunday.  I started tipsy, got tired and ended up hung over later in the day.

    Note to self: Try not to go out drinking right before the SAN fails.

    This caught us at an interesting time.  We're in the process of migrating to an entirely new set of servers, so some things were partially moved.  This made it difficult to follow our procedures as cleanly as we'd like.  The benefit was that we had much better documentation of everything on the server.  I would encourage everyone to really think through the process of implementing your DR plan and document as much as possible.  Following a checklist is much easier than trying to remember at night, under pressure, in a hurry, after a few drinks.

    I had a series of estimates on how long things would take.  They were accurate for any single server failure.  They weren't accurate for a SAN failure that took two servers down.  This wasn't bad, but we should have communicated better.

    Don't forget how many things are outside the database.  Logins, linked servers, DTS packages (yikes!), jobs, Service Broker, DTC (especially DTC), database triggers and any objects in the master database are all things you need backed up.  We'd done a decent job on this and didn't find significant problems here.  That said, this still took a lot of time.  There were many annoyances as a result of this.  Small settings like a login's default database had a big impact on whether an application could run.  This is probably the single biggest area of concern when looking to recreate a server.  I'd encourage everyone to go through every single node of SSMS and look for user-created objects or settings outside the database.

    Script out your logins with the proper SID and already-encrypted passwords and keep that script updated.  This makes life so much easier.  I used an approach based on KB246133 that worked well (a sketch of the idea follows at the end of this post).  I'll get my scripts posted over the next few days.

    The disaster can cause your DR process to fail in unexpected ways.  We have a job that scripts out all logins and role memberships and writes it to a file.  This runs on the DR server and pulls from the production server.  Upon opening the file I found that the contents were a "server not found" error.  Fortunately we had other copies and didn't need to try and restore the master database.  This now runs on the production server and pushes the script to the DR site.  Soon we'll get it pushed to our version control software.

    One of the biggest challenges is keeping your DR resources up to date.  Any server change (new linked server, new SQL Server Agent job, etc.) means that your DR plan (and scripts) is out of date.  It helps to automate the generation of these resources if possible.

    Take time now to test your database restore process.  We test ours quarterly.  If you have a large database, I'd also encourage you to invest in a compressed backup solution.  Restoring backups was the single largest consumer of time during our recovery.

    And yes, there's a database mirroring solution planned in our new architecture.

    I didn't have much involvement in things outside SQL Server, but this caused many, many things to change in our environment.  Many applications today aren't just executables or web sites.  They are a combination of those plus network infrastructure, reports, network ports, IP addresses, DTS and SSIS packages, batch systems and many other things.  These all needed a little bit of attention to make sure they were functioning properly.

    Profiler turned out to be a handy tool.  I started a trace for failed logins and kept that running.  That let me fix a number of problems before people were able to report them.  I also ran traces to capture exceptions.  This helped identify problems with linked servers.

    Overall the thing that gave me the most problems was linked servers.  In order for a linked server to function properly you need to be pointed to the right server, have the proper login information, have the network routes available and have MSDTC configured properly.  We have a lot of linked servers and this created many failure points.  Some of the older linked servers used IP addresses and not DNS names.  This meant we had to go in and touch all those linked servers when the servers moved.
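    To illustrate the login-scripting point above, here is a hedged sketch of the kind of statement such a script generates. This is not the author's actual KB246133-based script, and the login name, hash and SID values below are placeholders:

        -- Recreate a login with its original SID and password hash, so restored
        -- databases keep their user-to-login mappings and small settings such as
        -- the default database survive the server rebuild.
        CREATE LOGIN [AppLogin]
            WITH PASSWORD = 0x01004086CEB612BA4F8C7E916BCC46A9912A12D43BE977C2731A HASHED,
                 -- placeholder hash, scripted from the old server
            SID = 0x912E866AD1E94E24B18B16F75D7B7E01,  -- placeholder: original SID
            DEFAULT_DATABASE = [AppDb];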


  • linking Google AdWords account to Google Analytics account

    - by crmpicco
    I have a Google Analytics account that has two profiles, one for www.ayrshireminis.com and one for www.crmpicco.co.uk. I have a Google AdWords account that I would like to link to my Google Analytics account, but for some reason the AdWords admin is telling me I cannot do that. Within the AdWords admin, under My Account > Linked Accounts > Google Analytics, both profiles show as "Not Available"... it also has this message: "None of your profiles are available for linking due to your account settings." How can I link these two accounts?


  • Visual Studio 2010 Service Pack 1 And .NET Framework 4.0 Update

    - by Paulo Morgado
    As announced by Jason Zander in his blog post, Visual Studio 2010 Service Pack 1 has been available for download for MSDN subscribers since March 8 and to the general public since March 10. Brian Harry provides information related to TFS, and S. "Soma" Somasegar provides information on the latest Visual Studio 2010 enhancements. With this service pack for Visual Studio, an update to the .NET Framework 4.0 is also released. For detailed information about these releases, please refer to the corresponding KB articles:

    - Update for Microsoft .NET Framework 4
    - Description of Visual Studio 2010 Service Pack 1

    Update: When I was upgrading from the Beta to the final release on Windows 7 Enterprise 64-bit, the installation hung with "Returning IDCANCEL. INSTALLMESSAGE_WARNING [Warning 1946. Property 'System.AppUserModel.ExcludeFromShowInNewInstall' for shortcut 'Manage Help Settings - ENU.lnk' could not be set.]". Canceling the installation didn't work and I had to kill the setup.exe process. When I reapplied the service pack, rollbacks were reported, so I reapplied it once more – this time with success.


  • Samba folder is gone

    - by bioShark
    I seem to have some issues sharing folders from my Ubuntu 12.04 machine to a Win7 machine. After playing around with the settings, I decided to revert to Samba's original settings by reinstalling it:

        sudo apt-get purge samba
        sudo rm -rf /etc/samba/ /etc/default/samba
        sudo apt-get install samba

    Just to be sure, I also ran:

        sudo apt-get install samba samba-common system-config-samba winbind

    Now I can't find the /etc/samba folder any more. Even when I try to share a folder through Nautilus, I get:

        Samba's testparm returned error 1:
        Load smb config files from /etc/samba/smb.conf
        rlimit_max: increasing rlimit_max (1024) to minimum Windows limit (16384)
        params.c:OpenConfFile() - Unable to open configuration file "/etc/samba/smb.conf":
        No such file or directory
        Error loading services.

    Same when I try to list it:

        xxx@xxx:~$ ll /etc/samba
        ls: cannot access /etc/samba: No such file or directory

    Any ideas what I did wrong, or what other package am I missing? cheers
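    A likely explanation (an untested sketch, based on how Debian/Ubuntu handle conffiles): /etc/samba/smb.conf is shipped by the samba-common package, not samba, and a plain reinstall will not restore a conffile that was deleted by hand. Asking dpkg to put missing conffiles back might look like this:

        # Reinstall samba-common and tell dpkg to restore any missing conffiles
        sudo apt-get install --reinstall -o Dpkg::Options::="--force-confmiss" samba-common
        # If that still leaves no config, copy the packaged template back by hand
        sudo cp /usr/share/samba/smb.conf /etc/samba/smb.conf
        sudo restart smbd    # smbd is the Upstart job name on Ubuntu 12.04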


  • Integrate BING API for Search inside ASP.Net web application

    - by sreejukg
    As you might already know, Bing is Microsoft's search engine and is getting more popular day by day. Bing offers APIs that can be integrated into your website to increase your website's functionality. At this moment, there are two important APIs available:

    - Bing Search API
    - Bing Maps

    The Search API enables you to build applications that utilize Bing's technology. It allows you to search multiple source types such as web, images, video etc. and supports various output protocols such as JSON, XML, and SOAP. You will also be able to customize the search results as you wish for your public-facing website. The Bing Maps API allows you to build robust applications that use Bing Maps.

    In this article I am going to describe how you can integrate Bing search into your website. In order to start using Bing, first you need to sign in to http://www.bing.com/toolbox/bingdeveloper/ using your Windows Live credentials. Click on the Sign in button and you will be asked to enter your Windows Live credentials. Once signed in, you will be redirected to the Developer page. Here you can create applications and get an AppID for each application. Since I am a first-time user, I don't have any applications added. Click on the Add button to add a new application. You will be asked to enter certain details about your application. The fields are straightforward; the only thing you need to note is the website field, where you enter the address of the website from which you are going to use this application (this field is optional). Of course you need to agree to the terms and conditions and then click Save. Once you click Save, the application will be created and an application ID will be available for your use.

    Now we have the AppID. Bing supports three protocols: JSON, XML and SOAP. JSON is useful if you want to call the search requests directly from the browser and use JavaScript to parse the results, which makes JSON the favorite choice for AJAX applications. XML is the alternative for applications that do not support SOAP, e.g. Flash, Silverlight etc. SOAP is ideal for strongly typed languages and gives a request/response object model.

    In this article I am going to demonstrate how to search the Bing API using the SOAP protocol from an ASP.NET application. For the purpose of this demonstration, I am going to create an ASP.NET project and implement the search functionality in an aspx page. Open Visual Studio, navigate to File -> New Project, and select ASP.NET Empty Web Application; I named the project "BingSearchSample". Add a Search.aspx page to the project; once added, the solution explorer will look similar to the following.

    Now you need to add a web reference to the SOAP service available from Bing. To do this, from the solution explorer, right-click your project and select Add Service Reference. The new service reference dialog will appear. In the bottom left of the dialog, you can find the Advanced button; click on it. Now the service reference settings dialog will appear. In the bottom left, you can find the Add Web Reference button; click on it. The add web reference dialog will appear now. Enter the URL as http://api.bing.net/search.wsdl?AppID=<YourAppIDHere>&version=2.2 (replace <YourAppIDHere> with the AppID you have generated previously) and click on the button next to it. This will find the web service methods available. You can change the namespace suggested by Bing, but for the purpose of this demonstration I have accepted all the default settings.
Click on the Add Reference button once you are done. Now the web reference to the Search service will be added to your project; you can find it in the solution explorer. Now in the Search.aspx page that you previously created, place one textbox, a button and a grid view. For the purpose of this demonstration, I have given them the identifiers (ID) txtSearch, btnSearch and gvSearch respectively. The idea is to search for the text entered in the textbox using the Bing service and show the results in the grid view. In the design view, Search.aspx looks as follows. In the Search.aspx.cs page, add a using statement that points to net.bing.api. I have added the code for the button click event handler; a sketch of it appears at the end of this excerpt. The code is very straightforward: it just calls the service with your AppID, a query to search for and a source to search in. Let us run this page and see the output when I enter Microsoft in the textbox. If you want to search a specific site, you can include the site name in the query parameter. For example, the following query will search for the word Microsoft on the www.microsoft.com website. searchRequest.Query = "site:www.microsoft.com Microsoft"; The output of this query is as follows. Integrating the Bing search API into your website is easy and there is no limit on the customization of the interface you can do. There is no Bing branding required, so I believe this is a great option for web developers when they plan for site search.
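    Since the original click handler appears only as a screenshot in the article, here is a minimal sketch of what it might look like against version 2.2 of the Bing SOAP API. The proxy type names (BingService, SearchRequest, SourceType and so on) come from the generated web reference and may differ depending on the namespace you accepted; the AppId value is a placeholder for the one generated earlier.

    using net.bing.api; // namespace of the generated web reference (assumed)

    protected void btnSearch_Click(object sender, EventArgs e)
    {
        // Build the request: the AppID identifies your application to Bing
        SearchRequest searchRequest = new SearchRequest();
        searchRequest.AppId = "<YourAppIDHere>";
        searchRequest.Query = txtSearch.Text;
        searchRequest.Sources = new SourceType[] { SourceType.Web }; // search the web source

        // Call the service and bind the web results to the grid view
        BingService service = new BingService();
        SearchResponse response = service.Search(searchRequest);
        gvSearch.DataSource = response.Web.Results; // each WebResult carries Title, Description and Url
        gvSearch.DataBind();
    }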

    Read the article

  • Block access to specific applications

    - by Jason Aren
    I would like to give several users rights on a computer running Ubuntu to do most administrative functions such as add/remove programs, save files, make settings changes, etc. However, I would like to block them from using several specific applications. Is this possible, and how would I do so? To provide a bit more detail: I am trying to set up Gnome Nanny to block adult websites on my kids' computer. I'd like to give them full access to the computer EXCEPT for Gnome Nanny. Windows has a program called K9 that cannot be turned off or uninstalled unless the user has the password, EVEN if the user is an admin. It sounds like this isn't available on Ubuntu without a rather involved process of setting permissions on a large list of applications and functions to mimic admin rights.

    Read the article

  • How to create a new Team Project Collection in TFS2010:

    - by jehan
    TFS 2010 has introduced the notion of the Team Project Collection (TPC). I have already discussed TPCs in my earlier post; you can check it out here. In this post, I will demonstrate how to create a new Team Project Collection in TFS 2010. First, you have to open the TFS Administration Console (Start -> All Programs -> Microsoft Team Foundation Server 2010 -> Team Foundation Server Administration Console), expand the Application Tier node in the TFS Administration Console and click on Team Project Collections. Here you will see the TPCs that already exist; I have only one TPC, named New Collection, and I'm going to create a new TPC called Demo Collection. To create a new Team Project Collection, you need to click on Create Collection; it will open the Create New Team Project Collection window. Under the Name tab, you have to enter the name you want to give your new TPC (I am naming it Demo Collection). You can also provide a description for your TPC in the Description tab, which is optional, and then click Next. Here, you need to enter the name of the SQL Server instance where you want your new TPC data to reside. You have the option either to create a new database for this TPC or to use an existing empty database, and then click Next. In the next screen, you have to choose the SharePoint configuration. Here you have the option either to configure the SharePoint site for the TPC at the default location, to specify your existing SharePoint site, or to not configure SharePoint for this collection at all; if you choose the last option, you cannot configure SharePoint sites for any of the Team Projects under this Project Collection. You also have the flexibility to create a SharePoint site for this TPC later on, in which case you will have to configure the SharePoint site for the existing team projects manually. The next screen covers the Reports configuration. Here you have the option either to configure the Reports for the TPC at the default path, to specify the path to an existing Reports folder, or to not configure Reports for this collection; if you choose the last option, you cannot create Reports for any of the Team Projects under this Project Collection. Here also you can enable reporting for this TPC later on. The next screen relates to the Lab Management configuration. Lab Management is a new feature in TFS 2010 which enables users to create and manage virtual test environments where you can deploy and test your application. There are no options available here, as I don't have Lab Management configured for my Team Foundation Server. The next screen is the Review Configuration window, which shows all the configuration settings you have specified so that you can review them before creating the Team Project Collection. If you want to make any changes, you can go back to the previous windows and make them. After reviewing the configuration settings, you can click on the Verify button, which checks whether your Team Project Collection is ready to be created; it will show any errors and warnings that could make the creation of your Team Project Collection fail. You can then choose to create the Team Project Collection if the verify step doesn't throw any warnings or errors.
If the verify step reports any errors, it is strongly suggested that you rectify the issues before going ahead with the TPC creation. This applies especially to warnings, since it is common practice to overlook them. If you then choose the create TPC option, it will start the process of creating the Team Project Collection, and once it is completed you can check the configuration status of the different components. You can see in the screen below that all the components were configured successfully. In the next screen, you can find the location of the log file created for this Team Project Collection creation; this log file is really important in case of a creation failure, because it will help you find the root cause. Now you can see that the new Team Project Collection (Demo Collection) is available in the Team Foundation Collection tab and its status is Online. You can now try to connect to this Team Project Collection from Team Explorer: choose the newly created Team Project Collection and click on Connect. This Team Project Collection is empty because no Team Projects have been created yet. Now you can create new Team Projects and start working.
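    Once the collection is online, you can also verify it programmatically. The following is a minimal sketch using the TFS client API; the server URL is hypothetical and follows the http://<server>:8080/tfs/<CollectionName> convention.

    using System;
    using Microsoft.TeamFoundation.Client;

    class VerifyCollection
    {
        static void Main()
        {
            // Connect to the newly created collection (server name is hypothetical)
            TfsTeamProjectCollection tpc = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
                new Uri("http://tfsserver:8080/tfs/DemoCollection"));

            // Throws if the collection is unreachable or access is denied
            tpc.EnsureAuthenticated();
            Console.WriteLine("Connected to: " + tpc.Uri);
        }
    }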

    Read the article

  • Why do I receive multiple warnings of "No running instance of xfce4-panel was found" when logging into Xubuntu?

    - by Fredrik
    I'm running Xubuntu 11.04. The boot-up time is quite fast, but when I log in it takes close to a minute before the desktop is displayed, and meanwhile I see no activity on the hard drive. When I finally have the desktop I see this notification repeated 10 times: and then this one: In .config/autostart I have these entries:

    $ ls
    xfce4-settings-helper-autostart.desktop
    xfce4-clipman-plugin-autostart.desktop
    xfce-panel.desktop
    $ cat xfce-panel.desktop
    [Desktop Entry]
    Encoding=UTF-8
    Version=0.9.4
    Type=Application
    Name=xfce4-panel
    Comment=
    Exec=xfce4-panel
    StartupNotify=false
    Terminal=false
    Hidden=false

    I need some assistance locating the cause of the slow startup: which logs to look at, etc. And then there is this annoying message about xfce-panel. Where do I look to find where it is started from?

    Read the article

  • An Honest look at SharePoint Web Services

    - by juanlarios
    INTRODUCTION If you are a SharePoint developer you know that there are two basic ways to develop against SharePoint: 1) the object model, and 2) web services. The SharePoint object model has the advantage of being quite rich. Anything you can do through the SharePoint UI as an administrator or end user, you can do through the object model. In fact, everything that is done through the UI is done through the object model behind the scenes. The major disadvantage to getting at SharePoint this way is that the code needs to run on the server. This means that web parts, event receivers, features, etc. are all code that is deployed to the server. The second way to get to SharePoint is through the built-in web services. There are many articles on how to manipulate these web services, how to authenticate to them and how to interact with them. The basic idea is that a remote application or process can contact SharePoint through a web service. Lots has been written about how great these web services are. This article is written to document the limitations and some of the issues and frustrations of working with SharePoint's built-in web services. Ultimately, for the tasks I was given, the built-in web services did not suffice. My evaluation of them was made against creating my own WCF services to do what I needed. The project I'm working on right now involves several "integration points": a remote application, installed on a separate server, is to contact SharePoint and perform a task or operation. So I decided to start up Visual Studio and build a DLL with basically two layers of logic: an integration layer and a data layer. A good friend of mine pointed me to the SOLID principles and referred me to some videos and tutorials about them. I decided to implement the methodology (although a lot of the principles are common sense and I had already incorporated them into my coding practices). I was to deliver this DLL to the application team and they would simply call the methods exposed by it and voila! it would perform some task or operation in SharePoint. SOLUTION My integration layer implemented an interface that defined some of the basic integration tasks I was to put together. My data layer was about the same; it implemented an interface with some of the tasks I was going to develop. This gave me the opportunity to develop different data layers, ultimately different ways to get at SharePoint if I needed to. This is a classic SOLID principle. In this case it proved to be quite helpful, because I wrote one data layer completely implementing SharePoint's built-in web services and another implementing my own WCF service that I wrote. I should mention there is another layer underneath the data layer. In referencing SharePoint or WCF services in my Visual Studio project, I created a class for every web service. So, for example, for Lists.asmx I created a class called "DocumentRetrieval"; this class would do the grunt work of connecting to the correct URL, performing the basic operation of contacting the service, and so on. For Views.asmx, I implemented a class called "ViewRetrieval" with the same idea as the last class, but interacting with all the operations in Views.asmx. This gave my data layer the ability to perform multiple calls without really worrying about the grunt work each class performs. This again is a classic SOLID principle. So, in order to compare them side by side, we can look at both data layers and what is involved in each.
    Let's take a look at the "Create Project" task or operation. The integration point is described as: "the dll is to provide a way to create a project in SharePoint". Projects, in this case, are basically document libraries. I am to implement a way in which a remote application can create a document library in SharePoint. Easy enough, right? Use the Lists.asmx web service in SharePoint. So here we go! Let's take a look at the code. I added the Lists.asmx web service reference to my project and this is the class that contacts it:

    class DocumentRetrieval
    {
        private ListsSoapClient _service;
        private bool _impersonation;

        public DocumentRetrieval(bool impersonation, string endpt)
        {
            _service = new ListsSoapClient();
            this.SetEndPoint(string.Format("{0}/{1}", endpt, ConfigurationManager.AppSettings["List"]));
            _impersonation = impersonation;
            if (_impersonation)
            {
                _service.ClientCredentials.Windows.ClientCredential.Password = ConfigurationManager.AppSettings["password"];
                _service.ClientCredentials.Windows.ClientCredential.UserName = ConfigurationManager.AppSettings["username"];
                _service.ClientCredentials.Windows.AllowedImpersonationLevel =
                    System.Security.Principal.TokenImpersonationLevel.Impersonation;
            }
        }

        private void SetEndPoint(string p)
        {
            _service.Endpoint.Address = new EndpointAddress(p);
        }

        /// <summary>
        /// Creates a document library with a specific name and template ID
        /// </summary>
        /// <param name="listName">New list name</param>
        /// <param name="templateID">Template ID</param>
        /// <returns></returns>
        public XmlElement CreateLibrary(string listName, int templateID, ref ExceptionContract exContract)
        {
            XmlDocument sample = new XmlDocument();
            XmlElement viewCol = sample.CreateElement("Empty");
            try
            {
                _service.Open();
                viewCol = _service.AddList(listName, "", templateID);
            }
            catch (Exception ex)
            {
                exContract = new ExceptionContract("DocumentRetrieval/CreateLibrary", ex.GetType(), "Connection Error", ex.StackTrace, ExceptionContract.ExceptionCode.error);
            }
            finally
            {
                _service.Close();
            }
            return viewCol;
        }
    }

    There was a lot more in this class (that I am not including), because I was reusing the grunt work for other operations with Lists.asmx, for example updating content types and changing or configuring lists or document libraries. One of the first things I noticed about working with the built-in services is that you are really at the mercy of what is available to you. Before creating a document library (project) I wanted to expose an IsProjectExisting method. This way the integration or data layer could recognize if a library already exists. Well, there is no service call or method available to do that check.
    So this is what I wrote:

    public bool DocLibExists(string listName, ref ExceptionContract exContract)
    {
        try
        {
            var allLists = _service.GetListCollection();
            return allLists.ChildNodes.OfType<XmlElement>().ToList().Exists(x => x.Attributes["Title"].Value == listName);
        }
        catch (Exception ex)
        {
            exContract = new ExceptionContract("DocumentRetrieval/GetList/GetListWSCall", ex.GetType(), "Unable to Retrieve List Collection", ex.StackTrace, ExceptionContract.ExceptionCode.error);
        }
        return false;
    }

    This really just gets an XmlElement with all the lists. It was then up to me to sift through the clutter and noise and see if the document library already existed. This took a little getting used to: instead of working with code, you are working with the XmlElement response format from the web service. I wrote a LINQ query to check whether the "Title" attribute existed with a value equal to the list name; if so it returns true, if not false. I didn't particularly like working this way, dealing with XmlElement responses and then having to manipulate them to get at the exact data I was looking for. Once the DocLibExists check was done, I would either create the document library or send back an error indicating that the document library already existed. Now let's examine the code that actually creates the document library. It does what you are really after: it creates a document library. Notice how the template ID is really an integer. Every document library template in SharePoint has an ID associated with it. Document Library, Image Library, Custom List, Project Tasks, etc. all have a unique integer associated with them. Well, that's great, but the client came back to me and gave me some specifics that each "project" (document library) should have. They specified that they had three types of projects, that each project would have unique views (about 10 views per project), and that each project required unique configurations (auditing, versioning, content types, etc.). So what seemed like a simple implementation of creating a document library as a repository for a project turned out to be quite involved. The first thing I thought of was to create a template for the document library. There are other ways you can do this too: using web service calls, you could configure views, versioning, even content types, etc. The only catch is that you have to work quite extensively with CAML. I am not fond of CAML. I can work with it; I just don't like doing it. It is quite touchy, and at times it is quite tough to understand where errors were made in CAML statements. Working with web services and CAML proved to be quite annoying. The service call would return a generic error message that did not particularly point me to a CAML syntax error, or even a CAML error at all. I was not sure if it was a security, performance or code-based issue. At times it was also difficult because of the way SharePoint handles metadata: there are "Name", "Display Name", and "StaticName" fields, and it was quite tough at times to understand which one to use, so it took a lot of trial and error. There are tools that can help with CAML generation, and there is now IntelliSense for CAML statements in Visual Studio that might help, but ultimately I'm not fond of CAML with web services. So I decided on the template.
So my plan was to create a document library, configure it accordingly and then use the Template Builder that comes with the SharePoint SDK. This tool allows you to create site templates, list templates, etc. It is quite interesting because it does not generate an STP file; it actually generates an XML definition and a feature you can activate to make that template available on a site or site collection. The first issue I experienced was that one of the specifications for this template was that the "All Documents" view was to have two web parts on it. Well, it turns out that the template builder did not include the web parts as part of the list template definition it generated. It backed up the settings, the views and the content types, but not the custom web parts. I still decided to try this even without the web parts on the page. This new template defined a new document library definition with a unique ID. The problem was that the service call accepts an int but only has access to the built-in library int definitions; any new ones added or created are not available to create from. So this made it impossible for me to approach the problem this way. I should also mention that one of the nice features of SharePoint is the ability to create list templates, back them up and then create lists based on those templates. It can all be done by end-user administrators. These templates are quite unique because they are saved as an STP file and not an XML definition. I also went this route and tried to see if there was another service call where I could create a document library based on a given template name. Nope! None. After some thinking, I decided to implement a WCF service to do this creation for me. I was quite certain that the object model would allow me to create document libraries based on a template for which an ID was required, and also on templates saved as STP files. Now, I don't want to bother with posting the code to contact the WCF service because it's self-explanatory, but I will post the code that I used to create a list with a custom template.

public ServiceResult CreateProject(string name, string templateName, string projectId)
{
    string siteurl = SPContext.Current.Site.Url;
    Guid webguid = SPContext.Current.Web.ID;
    using (SPSite site = new SPSite(siteurl))
    {
        using (SPWeb rootweb = site.RootWeb)
        {
            SPListTemplateCollection temps = site.GetCustomListTemplates(rootweb);
            ProcessWeb(siteurl, webguid, web => Act_CreateProject(web, name, templateName, projectId, temps));
        } // SPWeb
    } // SPSite
    return _globalResult;
}

private void Act_CreateProject(SPWeb targetsite, string name, string templateName, string projectId, SPListTemplateCollection temps)
{
    var temp = temps.Cast<SPListTemplate>().FirstOrDefault(x => x.Name.Equals(templateName));
    if (temp != null)
    {
        try
        {
            Guid listGuid = targetsite.Lists.Add(name, "", temp);
            SPList newList = targetsite.Lists[listGuid];
            _globalResult = new ServiceResult(true, "Success", "Success");
        }
        catch (Exception ex)
        {
            _globalResult = new ServiceResult(false, (string.IsNullOrEmpty(ex.Message) ? "None" : ex.Message + " " + templateName), ex.StackTrace.ToString());
        }
    }
}

private void ProcessWeb(string siteurl, Guid webguid, Action<SPWeb> action)
{
    using (SPSite sitecollection = new SPSite(siteurl))
    {
        using (SPWeb web = sitecollection.AllWebs[webguid])
        {
            action(web);
        }
    }
}

This is some of the code I implemented for the service; there was a lot more I did on project creation, which I will cover in my next blog post. I implemented an Action method to process the web. This allowed me to properly dispose of the SPWeb and SPSite objects and not rewrite this code over and over again. So I implemented a WCF service to create projects for me. This allowed me to do a lot more than just create a document library with a template; it gave me the flexibility to do just about anything the client wanted at project creation. Once this was implemented, the client came back to me and said, "we reference all our projects with IDs in our application; we want SharePoint to do the same". This has been something I have been doing for a little while now, but I do hope that SharePoint 2010 has more of an answer to this and addresses it properly. I have been adding metadata to SPWebs through the property bag (I believe I have blogged about it before). This time it required metadata added to a document library. No problem!!! I also mentioned the web parts that were to go on the "All Documents" view. I took the opportunity to configure them with the appropriate settings. There were two settings that needed to be set on these web parts; one of them was a Project ID configured in the web part properties.
The following code enhances and replaces the Act_CreateProject method above:

private void Act_CreateProject(SPWeb targetsite, string name, string templateName, string projectId, SPListTemplateCollection temps)
{
    var temp = temps.Cast<SPListTemplate>().FirstOrDefault(x => x.Name.Equals(templateName));
    if (temp != null)
    {
        SPLimitedWebPartManager wpmgr = null;
        try
        {
            Guid listGuid = targetsite.Lists.Add(name, "", temp);
            SPList newList = targetsite.Lists[listGuid];
            SPFolder rootFolder = newList.RootFolder;
            rootFolder.Properties.Add(KEY, projectId);
            rootFolder.Update();
            if (rootFolder.ParentWeb != targetsite)
                rootFolder.ParentWeb.Dispose();
            if (!templateName.Contains("Natural"))
            {
                SPView alldocumentsview = newList.Views.Cast<SPView>().FirstOrDefault(x => x.Title.Equals(ALLDOCUMENTS));
                SPFile alldocfile = targetsite.GetFile(alldocumentsview.ServerRelativeUrl);
                wpmgr = alldocfile.GetLimitedWebPartManager(PersonalizationScope.Shared);
                ConfigureWebPart(wpmgr, projectId, CUSTOMWPNAME);
                alldocfile.Update();
            }
            if (newList.ParentWeb != targetsite)
                newList.ParentWeb.Dispose();
            _globalResult = new ServiceResult(true, "Success", "Success");
        }
        catch (Exception ex)
        {
            _globalResult = new ServiceResult(false, (string.IsNullOrEmpty(ex.Message) ? "None" : ex.Message + " " + templateName), ex.StackTrace.ToString());
        }
        finally
        {
            if (wpmgr != null)
            {
                wpmgr.Web.Dispose();
                wpmgr.Dispose();
            }
        }
    }
}

private void ConfigureWebPart(SPLimitedWebPartManager mgr, string prjId, string webpartname)
{
    var wp = mgr.WebParts.Cast<System.Web.UI.WebControls.WebParts.WebPart>().FirstOrDefault(x => x.DisplayTitle.Equals(webpartname));
    if (wp != null)
    {
        (wp as ListRelationshipWebPart.ListRelationshipWebPart).ProjectID = prjId;
        mgr.SaveChanges(wp);
    }
}

This shows you how I was able to set metadata on the document library. It has to be added to the RootFolder of the document library; unfortunately, the SPList does not have a property bag that I can add a key/value pair to, so it has to be done on the root folder. Now everything in the integration will reference projects by ID and will not care about names. My DocLibExists method will now need to be changed, because a web service is not set up to look at property bags; I had to write another method on the service to do the equivalent but with IDs instead of names. The second thing you will notice about the code is the use of the web part manager. I have seen several examples online, and have also read a lot about memory leaks. The above code does not produce memory leaks: the web part manager creates an SPWeb, so just dispose of it as I did.
CONCLUSION This is a long, long post, so I will stop here for now; I will continue with more comparisons and limitations in my next post. My conclusion for this example is that the web services will do the trick if you can suffer through CAML and you are doing some simple operations. For everything else, there's WCF! I apologize for the disorganization of this post; I was on a bus on a 12-hour trip to Iowa while I wrote it, half asleep and half awake. Hopefully it makes enough sense to someone.

    Read the article

  • Hack Extension Files to Make Them Version-Compatible for Firefox

    - by Asian Angel
    A well known drawback in using Firefox is the problem with extension compatibility when a new major version is released. Whether it is for a new extension that you are trying for the first time or an old favorite we have a way to get those extensions working for you again. There are multiple reasons why you might want to choose this method to fix a non-compatible extension: You are uncomfortable with tweaking the “about:config” settings You prefer to maintain the original “about:config” settings in a pristine state and like having compatibility checking active You are looking to gain some “geek cred” Keep in mind that most extensions will work perfectly well with a new version of Firefox and simply have the “version compatibility number” problem. But once in a while there may be one that needs to have some work done on it by the extension’s author. The Problem Here is a perfect example of everyone’s least favorite “extension message”. This is the last thing that you need when all that you want is for your favorite extension (or a new one) to work on a fresh clean install. Note: This works nicely to “replace” non-compatible extensions already present in your browser if you are simply upgrading. Hacking the XPI File For this procedure you will need to manually download the extension to your hard-drive (right click on the extension’s “Install Button” and select “Save As”). Once you have done that you are ready to start hacking the extension. For our example we chose the “GCal Popup Extension”. The best thing to do is place the extension in a new folder (i.e. the Desktop or other convenient location) then unzip it just the same way that you would with any regular zip file. Once it is unzipped you will see the various folders and files that were in the “xpi file” (we had four files here but depending on the extension the number may vary). There is only one file that you need to focus on…the “install.rdf” file. Note: At this point you should move the original extension file to a different location (i.e. outside of the folder) so that it is no longer present. Open the file in “Notepad” so that you can change the number for the “maxVersion”. Here the number is listed as “3.5.*” but we needed to make it higher… Replacing the “5” with a “7” is all that we needed to do. Once you have entered your new “maxVersion” number save the file. At this point you will need to re-zip all of the files back into a single file. Make certain that you “create” a file with the “.zip file extension” otherwise this will not work. Once you have the new zip file created you will need to rename the entire file including the “file extension”. For our example we copied and pasted the original extension name. Once you have changed the name click outside of the “text area”. You will see a small message window like this asking for confirmation…click “Yes” to finish the process. Now your modified/updated extension is ready to install. Drag the extension into your browser to install it and watch that wonderful “Restart to complete the installation.” message appear. As soon as your browser starts you can check the “Add-ons Manager Window” and see the version compatibility numbers for the extension. Looking very very nice! And just like that your extension should be up and running without any problems. 
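    For reference, the value being edited lives in the em:targetApplication section of install.rdf. A typical fragment looks something like the following (the version numbers are illustrative; the GUID shown is Firefox's application ID):

    <em:targetApplication>
      <Description>
        <em:id>{ec8030f7-c20a-464f-9b0e-13a3a9e97384}</em:id>
        <em:minVersion>3.0</em:minVersion>
        <em:maxVersion>3.7.*</em:maxVersion>
      </Description>
    </em:targetApplication>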
Conclusion If you are looking to try something new, gain some geek cred, or just want to keep your Firefox install as close to the original condition as possible this method should get those extensions working nicely for you again.

    Read the article

  • Browsing Your ADF Application Module Pooling Params with WLST

    - by Duncan Mills
    In ADF 11g you can of course use Enterprise Manager (EM) to browse and configure the settings used by ADF Business Components Application Modules, as shown here for one of my sample deployed applications. You can access this screen from the EM homepage by pulling down the Application Deployment menu and then choosing ADF > Configure ADF Business Components. Then select the profile that you are actually using (hint: look in the DataBindings.cpx file to work this out - probably the "Local" version unless you've explicitly changed it). So, from this screen you can change the pooling parameters and the world is good. But what if you don't have EM installed? In that case you can use the WebLogic scripting capabilities to view (and update) the MBean properties.
So if you deploy an app with just one configuration you'll see 5 directories, if it has two configurations in the .xcfg you'll see 9 and so on.The directory you are looking for will contain those bits of information you gathered in Step 0, specifically the Application Name, the configuration you are using and the xcfg name: First of all narrow your list to just those directories returned from the ls() command that begin oracle.bc4j.mbean.config:name=AMPool. These identify the AM pooling MBeans for all the deployed applications. Now look for the correct application name e.g. Application=FMWdh_application1The config setting in that sub-list should already be correct and match what you expect e.g. oracle.bc4j.mbean.config=oracle.demo.fmwdh.model.service.common.bc4j.xcfgFinally look for the correct value for the AppModuleConfigType e.g. oracle.bc4j.mbean.config.AppModuleConfigType=DealHelpServiceDeployedNow you have identified the correct directory name, change to that (keep the name on one line of course - I've had to split it across lines here for clarity:wls:> cd ('oracle.bc4j.mbean.config:name=AMPool,     type=oracle.bc4j.mbean.config.AppModuleConfigType.AMPoolType,    oracle.bc4j.mbean.config=oracle.demo.fmwdh.model.service.common.bc4j.xcfg,    Application=FMWdh_application1,    oracle.bc4j.mbean.config.AppModuleConfigType=DealHelpServiceDeployed') Now you can actually view the parameter values with a simple ls() commandwls:> ls()And here's the output in which you can view the realtime values of the various pool settings: -rw- AmpoolConnectionstrategyclass oracle.jbo.common.ampool.DefaultConnectionStrategy -rw- AmpoolDoampooling true -rw- AmpoolDynamicjdbccredentials false -rw- AmpoolInitpoolsize 2 -rw- AmpoolIsuseexclusive true -rw- AmpoolMaxavailablesize 40 -rw- AmpoolMaxinactiveage 600000 -rw- AmpoolMaxpoolsize 4096 -rw- AmpoolMinavailablesize 2 -rw- AmpoolMonitorsleepinterval 600000 -rw- AmpoolResetnontransactionalstate true -rw- AmpoolSessioncookiefactoryclass oracle.jbo.common.ampool.DefaultSessionCookieFactory -rw- AmpoolTimetolive 3600000 -rw- AmpoolWritecookietoclient false -r-- ConfigMBean true -rw- ConnectionPoolManager oracle.jbo.server.ConnectionPoolManagerImpl -rw- Doconnectionpooling false -rw- Dofailover false -rw- Initpoolsize 0 -rw- Maxpoolcookieage -1 -rw- Maxpoolsize 4096 -rw- Poolmaxavailablesize 25 -rw- Poolmaxinactiveage 600000 -rw- Poolminavailablesize 5 -rw- Poolmonitorsleepinterval 600000 -rw- Poolrequesttimeout 30000 -rw- Pooltimetolive -1 -r-- ReadOnly false -rw- Recyclethreshold 10 -r-- RestartNeeded false -r-- SystemMBean false -r-- eventProvider true -r-- eventTypes java.lang.String[jmx.attribute.change] -r-- objectName oracle.bc4j.mbean.config:name=AMPool,type=oracle.bc4j.mbean.config.AppModuleConfigType.AMPoolType,oracle.bc4j.mbean.config=oracle.demo.fmwdh.model.service.common.bc4j.xcfg,Application=FMWdh_application1,oracle.bc4j.mbean.config.AppModuleConfigType=DealHelpServiceDeployed -rw- poolClassName oracle.jbo.common.ampool.ApplicationPoolImpl Thanks to Brian Fry on the JDeveloper PM Team who did most of the work to put this sequence of steps together with me badgering him over his shoulder.

    Read the article

  • "cannot open file userpref.blend@ for writing: Permission denied" in blender

    - by ganezdragon
    I'm using Blender 2.69, installed via the software centre, and when I save my user preferences through File - User Preferences and click on "Save User Settings" there is a message "cannot open file /home/ganez/.config/blender/2.69/config/userpref.blend@ for writing: permission denied". I have checked the path /home/ganez/.config/blender/2.69/config/ and there is no userpref.blend file present. PS: I think this has something to do with the file permissions for that config folder, and I have no idea how to use the chmod command. So any advice? Thank you in advance.

    Read the article

  • Syntax Recognition for XML-Based Languages in Oracle JDeveloper

    - by Ramkumar Menon
    @Thanks Jeffrey Stephenson If you are looking at using any one of the new XML-based languages, let's say DocBook XML, or XProc, or what not, you can make use of JDeveloper's syntax highlighting and completion insight features to save those extra keystrokes. All you need is a URL or local copy of the XML Schema for the language. Once you have it, you can register it via Tools --> Preferences --> XML Schemas. Remember to provide a new extension name [using the default .xml extension did not work for me]. I provided my own extension .dbk for my DocBook files. Once you save these settings, you can create new files that conform to the schema, and you get validation, completion insight and prompting for free.

    Read the article

  • Sort Your Emails by Conversation in Outlook 2010

    - by Matthew Guay
    Do you prefer the way Gmail sorts your emails by conversation?  Here's how you can use this handy feature in Outlook 2010 too. One exciting new feature in Outlook 2010 is the ability to sort and link your emails by conversation.  This makes it easier to know what has been discussed in emails, and helps you keep your inbox tidier.  Some users don't like their emails linked into conversations, and in the final release of Outlook 2010 it is turned off by default.  Since this is a new feature, new users may overlook it and never know it's available.  Here's how you can enable conversation view and keep your email conversations accessible and streamlined. Activate Conversation View By default, your inbox in Outlook 2010 will look much like it always has in Outlook: a list of individual emails. To view your emails by conversation, select the View tab and check the Show as Conversations box on the top left. Alternately, click on the Arrange By tab above your emails, and select Show as Conversations. Outlook will ask if you want to activate conversation view in only this folder or all folders.  Choose All folders to view all emails in Outlook in conversations. Outlook will now re-sort your inbox, linking emails in the same conversation together.  Individual emails that don't belong to a conversation will look the same as before, while conversations will have a white triangle caret on the top left of the message title.  Select the message to read the latest email in the conversation. Or, click the triangle to see all of the messages in the conversation.  Now you can select and read any one of them. Most email programs and services include the previous email in the body of an email when you reply.  Outlook 2010 can recognize these previous messages as well.  You can navigate between older and newer messages from popup Next and Previous buttons that appear when you hover over the older email's header.  This works both in the standard Outlook preview pane and when you open an email in its own window.   Edit Conversation View Settings Back in the Outlook View tab, you can tweak your conversation view to work the way you want.  You can choose to have Outlook Always Expand Conversations, Show Senders Above the Subject, and Use Classic Indented View.  By default, Outlook will show messages from other folders in the conversation, which is generally helpful; however, if you don't like this, you can uncheck it here.  All of these settings will stay the same across all of your Outlook accounts. If you choose Indented View, it will show the title on top and then an indented message entry underneath showing the name of the sender. The Show Senders Above the Subject view makes it more obvious who the email is from and who else is active in the conversation.  This is especially useful if you usually only email certain people about certain topics, making the subject lines less relevant. Or, if you decide you don't care for conversation view, you can turn it off by unchecking the box in the View tab as above. Conclusion Although it may take new users some time to get used to, conversation view can be very helpful in keeping your inbox organized and letting important emails stay together.  If you're a Gmail user syncing your email account with Outlook, you may find this useful as it makes Outlook 2010 work more like Gmail, even when offline. If you'd like to sync your Gmail account with Outlook 2010, check out our articles on syncing it with POP3 and IMAP.

    Read the article

  • Create Second Web Application using the Default port 80 In SharePoint2010

    - by ybbest
    As a SharePoint developer, one of the common tasks is creating a SharePoint Web Application. In this post I will show you how to create a second Web Application using the default port 80 in SharePoint 2010. You need to follow the steps below. 1. Go to Central Admin => Application Management => Manage web applications and click New Web Application. 2. I chose YBBEST as my IIS site name and host header name, changed the port number to 80 and left the rest of the settings at their defaults. 3. After the web application creation wizard completes, add an entry in the hosts file located at C:\Windows\System32\Drivers\etc\hosts. 4. Create a root site collection for the new web application. After the site collection is created, you can browse to it using the URL http://ybbest.
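    If you prefer to script the same steps instead of using Central Administration, a rough sketch using the server object model follows. This is an assumption-laden outline rather than production code: the SPWebApplicationBuilder property names are from memory and may vary slightly between versions, and the host header and site name are this example's values. The matching hosts file entry would map ybbest to 127.0.0.1.

    using System;
    using Microsoft.SharePoint.Administration;

    class CreateSecondWebApp
    {
        static void Main()
        {
            // Build a new web application on the default port, distinguished by host header
            SPWebApplicationBuilder builder = new SPWebApplicationBuilder(SPFarm.Local);
            builder.Port = 80;                // reuse the default port
            builder.HostHeader = "ybbest";    // host header keeps it distinct from the first app on port 80
            builder.ServerComment = "YBBEST"; // IIS site name
            builder.CreateNewDatabase = true; // new content database for this web application

            SPWebApplication webApp = builder.Create();
            webApp.Provision();               // commit the IIS site and content database
            Console.WriteLine("Created web application: " + webApp.DisplayName);
        }
    }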

    Read the article

  • Coherence Configuration For Multiple HA SOA Domains

    - by [email protected]
    The HA document does not require the specification of the wka port and localport for Coherence, but if you would like to create multiple SOA HA domains, you have to use different Coherence settings for these domains. For SOA Domain 1, set the following properties in the WebLogic server startup arguments:
    -Dtangosol.coherence.wka1=apphost1vhn1
    -Dtangosol.coherence.wka1.port=<port1>
    -Dtangosol.coherence.wka2=apphost2vhn1
    -Dtangosol.coherence.wka2.port=<port1>
    -Dtangosol.coherence.localhost=apphost1vhn1
    -Dtangosol.coherence.localport=<port1>
    For SOA Domain 2, set the following properties in the WebLogic server startup arguments:
    -Dtangosol.coherence.wka1=apphost1vhn1
    -Dtangosol.coherence.wka1.port=<port2>
    -Dtangosol.coherence.wka2=apphost2vhn1
    -Dtangosol.coherence.wka2.port=<port2>
    -Dtangosol.coherence.localhost=apphost1vhn1
    -Dtangosol.coherence.localport=<port2>
    <port1> and <port2> must be different.

    Read the article

  • Running Jetty under Windows Azure Using RoleEntryPoint in a Worker Role

    - by Shawn Cicoria
    This post is built upon the work of Mario Kosmiskas and David C. Chou's prior postings, from here: http://blogs.msdn.com/b/mariok/archive/2011/01/05/deploying-java-applications-in-azure.aspx  http://blogs.msdn.com/b/dachou/archive/2010/03/21/run-java-with-jetty-in-windows-azure.aspx As Mario points out in his post, when you need to have more control over the process that starts, it is generally better left to a RoleEntryPoint capability which, as of now, requires the use of a CLR-based assembly that is deployed as part of the package to Azure. There were things I especially liked about Mario's post, specifically the ability to pull down the JRE and Jetty runtimes at role startup and instantiate the process using the extracted bits.  The way Mario initialized the Java process (and Jetty) was to take advantage of a role startup task configured as part of the service definition.  This is a great quick way to kick off processes or tasks prior to your role entry point.  However, if you need access to service configuration values or role events, that's where RoleEntryPoint comes in.  For this PoC sample I moved the logic for retrieving the bits for the JRE and Jetty to the worker role's OnStart, in addition to moving the process kickoff to the OnStart method.  The Run method at this point is there to loop and just report the status of the Java process. Beyond just making things more parameterized, both Mario's and David's articles still form the essence of the approach. The solution that accompanies this post provides all the necessary .NET-based Visual Studio projects.  In addition, you'll need: 1. The Jetty 7 runtime http://www.eclipse.org/jetty/downloads.php 2. The JRE http://www.oracle.com/technetwork/java/javase/downloads/index.html Once you have these, the first step is to create archives (zips) of the distributions.  For this PoC, the structure of the archive requires that the root of the archive looks as follows: JRE6.zip jetty---.zip Upload the contents to a storage container (block blob); for this example I used /archives as the location.  The service configuration has several settings that allow (and this is the advantage of using RoleEntryPoint) these things to be provided via Azure's native configuration support in a worker role. Storage Explorer You can use development storage for testing this out; the zipped version of the solution is configured for development storage.  When you're ready to deploy, you update the two settings: one for diagnostics and the other for the storage container where the /archives are going to be stored.
<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="HostedJetty" osFamily="2" osVersion="*">
  <Role name="JettyWorker">
    <Instances count="1" />
    <ConfigurationSettings>
      <!--<Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="DefaultEndpointsProtocol=https;AccountName=<accountName>;AccountKey=<accountKey>" />-->
      <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" />
      <Setting name="JettyArchive" value="jetty-distribution-7.3.0.v20110203b.zip" />
      <Setting name="StartRole" value="true" />
      <Setting name="BlobContainer" value="archives" />
      <Setting name="JreArchive" value="jre6.zip" />
      <!--<Setting name="StorageCredentials" value="DefaultEndpointsProtocol=https;AccountName=<accountName>;AccountKey=<accountKey>"/>-->
      <Setting name="StorageCredentials" value="UseDevelopmentStorage=true" />

For interacting with Storage you can use several tools; one tool that I like is from the Windows Azure CAT team, located here: http://appfabriccat.com/2011/02/exploring-windows-azure-storage-apis-by-building-a-storage-explorer-application/ and shown in the prior picture. At runtime, during role initialization and startup, Azure will call into your RoleEntryPoint.  At that time the code will do a dynamic pull of the 2 archives and extract them, using the SharpZipLib <link> as Mario had demonstrated in his sample.  The only difference here is the use of CLR code vs. PowerShell (which is really CLR, but that's another discussion). At this point, once the 2 zips are extracted, the role's file system looks as follows: Worker Role approot From there, the OnStart method (which also does the download and unzip using a simple StorageHelper class) kicks off the Java path and now you have Java! Task Manager Jetty Sample Page A couple of things I'm working on to enhance this are to extract the JRE and Jetty bits not to the appRoot but to a resource location defined as part of the service definition. ServiceDefinition.csdef
<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="HostedJetty" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WorkerRole name="JettyWorker">
    <Imports>
      <Import moduleName="Diagnostics" />
      <Import moduleName="RemoteAccess" />
      <Import moduleName="RemoteForwarder" />
    </Imports>
    <Endpoints>
      <InputEndpoint name="JettyPort" protocol="tcp" port="80" localPort="8080" />
    </Endpoints>
    <LocalResources>
      <LocalStorage name="Archives" cleanOnRoleRecycle="false" sizeInMB="100" />
    </LocalResources>

As the concept matures a bit, being able to dynamically update the content or jar files as part of a running Java solution is something that is possible through continued enhancement of this simple model. The Visual Studio 2010 solution is located here: HostingJavaSln_NDA.zip
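To tie the pieces together, below is a stripped-down sketch of the worker role described above. The StorageHelper calls are stand-ins for the simple helper class mentioned in the post (its real method names are not shown there), and the jre6 and jetty-distribution folder names follow the archive layout used earlier.

using System;
using System.Diagnostics;
using System.IO;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class JettyWorker : RoleEntryPoint
{
    private Process _javaProcess;

    public override bool OnStart()
    {
        string approot = Environment.GetEnvironmentVariable("RoleRoot") + @"\approot";

        // Setting names match the ServiceConfiguration shown above
        string container = RoleEnvironment.GetConfigurationSettingValue("BlobContainer");
        string jreZip = RoleEnvironment.GetConfigurationSettingValue("JreArchive");
        string jettyZip = RoleEnvironment.GetConfigurationSettingValue("JettyArchive");

        // Hypothetical helper: download each archive from blob storage and unzip into approot
        // StorageHelper.DownloadAndExtract(container, jreZip, approot);
        // StorageHelper.DownloadAndExtract(container, jettyZip, approot);

        // Launch Jetty with the extracted JRE
        var psi = new ProcessStartInfo
        {
            FileName = Path.Combine(approot, @"jre6\bin\java.exe"),
            Arguments = "-jar start.jar",
            WorkingDirectory = Path.Combine(approot, "jetty-distribution-7.3.0.v20110203b"),
            UseShellExecute = false
        };
        _javaProcess = Process.Start(psi);
        return base.OnStart();
    }

    public override void Run()
    {
        // Loop and report the status of the java process, as described above
        while (true)
        {
            Trace.WriteLine("Jetty process exited: " + _javaProcess.HasExited);
            Thread.Sleep(TimeSpan.FromSeconds(30));
        }
    }
}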

    Read the article

  • Can't download web photo albums to Picasa

    - by Arcadie
    Someone has shared a Picasa web album (Limited, anyone with the link), but I can't download it to Picasa. The following alert appears: Firefox doesn't know how to open this address, because the protocol (picasa) isn't associated with any program. I have Picasa 3.0.0 installed on Ubuntu 11.04, and I remember it saying something about registering the picasa protocol with Firefox during the installation. I have Firefox 6.0.2, and these settings are present in about:config:
    network.protocol-handler.app.picasa;/usr/bin/picasa
    network.protocol-handler.expose.picasa;true
    network.protocol-handler.external.picasa;true
    Picasa is located here:
    $ which picasa
    /usr/bin/picasa
    Is there something I can do to make this work? PS: I hope this is not off-topic here, and I can't find the "picasa" tag. Could someone please add it, if appropriate?

    Read the article

  • How to Automate your Database Documentation

    - by Jonathan Hickford
    In my previous post, "Automating Deployments with SQL Compare command line", I looked at how teams can automate the deployment and post-deployment validation of SQL Server databases using the command line versions of Red Gate tools. In this post I'm looking at another use for the command line tools, namely using them to generate up-to-date documentation with every database change. There are many reasons why up-to-date documentation is valuable: for example, when somebody new has to work on or administer a database for the first time, or when a new database comes into service. Having database documentation reduces the risk of making incorrect decisions when making changes. Documentation is very useful to business intelligence analysts when writing reports, for example in SSRS. There are a couple of great examples talking about why up-to-date documentation is valuable on this site:  Database Documentation – Lands of Trolls: Why and How? and Database Documentation Using SQL Doc. The short answer is that it can save you time and reduce risk when you need that most! SQL Doc is a fast, simple tool that automatically generates database documentation. It can create documents as HTML, Word or PDF files. The documentation contains information about object definitions and dependencies, along with any other information you want to associate with each object. The SQL Doc GUI, which is included in Red Gate's SQL Developer Bundle and SQL Toolbelt, allows you to add additional notes to objects and customise which objects are shown in the docs.  These settings can be saved as a .sqldoc project file. The SQL Doc command line can use this project file to automatically update the documentation every time the database is changed, ensuring that the documentation is always up to date. The simplest way to keep documentation up to date is probably to use a scheduled task to run a script every day. However, if you have a source-controlled database, or are using a Continuous Integration (CI) server or a build server, it may make more sense to use that instead. If you're using SQL Source Control or SSDT Database Projects to help version control your database, you can automatically update the documentation after each change is made to the source control repository that contains your database. To get this automation in place, you can use the functionality of a Continuous Integration (CI) server, which can trigger commands to run when a source control repository has changed. A CI server will also capture and save the documentation that is created as an artifact, so you can always find the exact documentation for a specific version of the database. This forms an always up-to-date data dictionary. If you don't already have a CI server in place, there are several you can use, such as the free open source Jenkins or the free starter editions of TeamCity. I won't cover setting these up in this article, but there is information about using CI servers for automating database tasks on the Red Gate Database Delivery webpage. You may be interested in Red Gate's SQL CI utility (part of the SQL Automation Pack), which is an easy way to update a database with the latest changes from source control. The PowerShell example below shows how to create the documentation from a database. That database might be your integration database or a shared development database that is always up to date with the latest changes.
$serverName = "server\instance"
$databaseName = "databaseName"   # To document multiple databases, use a comma separated list
$userName = "username"
$password = "password"

# Path to SQLDoc.exe
$SQLDocPath = "C:\Program Files (x86)\Red Gate\SQL Doc 3\SQLDoc.exe"

$arguments = @(
    "/server:$($serverName)",
    "/database:$($databaseName)",
    "/username:$($userName)",
    "/password:$($password)",
    "/filetype:html",
    "/outputfolder:.",
    # "/project:$($args[0])", # If you already have a .sqldoc project file you can pass it as an argument to this script. Values in the project will be overridden with any options set on the command line.
    "/name:$databaseName Report",
    "/copyrightauthor:$([Environment]::UserName)"
)

Write-Host $arguments
& $SQLDocPath $arguments

    There are several options you can set on the command line to vary how your documentation is created. For example, you can document multiple databases or exclude certain types of objects. In the example above, we set the name of the report to match the database name, and use the current Windows user as the documentation author. For more examples of how you can customise the report from the command line, please see the SQL Doc command line documentation.

    If you already have a .sqldoc project file, or wish to further customise the report by including or excluding specific objects, you can use this project on the command line. Any settings you specify on the command line will override the defaults in the project. For details of what you can customise in the project, please see the SQL Doc project documentation. In the example above, the line to use a project is commented out, but you can uncomment it and then pass a path to a .sqldoc project file as an argument to this script.

    Conclusion

    Keeping documentation about your databases up to date is very easy to set up using SQL Doc and PowerShell. By using a CI server to run this process, you can trigger the documentation to be generated on every change to a source controlled database, and keep historic documentation available. If you are considering more advanced database automation, e.g. database unit testing, change script generation, deploying to large numbers of targets, and backup/verification, please email me at [email protected] for further script samples or if you have any questions.
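    As a follow-up to the scheduled task suggestion above, here is a minimal sketch of registering a daily Windows scheduled task that runs the documentation script. The script path and task name are hypothetical placeholders, and the ScheduledTasks cmdlets require Windows 8 / Server 2012 or later (on older systems, schtasks.exe offers the same capability):

# A minimal sketch, assuming the documentation script above is saved as
# C:\Scripts\Update-SqlDoc.ps1 (a hypothetical path).
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Update-SqlDoc.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
# "Nightly SQL Doc" is a placeholder task name.
Register-ScheduledTask -TaskName "Nightly SQL Doc" -Action $action -Trigger $trigger

    If you would rather drive this from a CI server instead, the same powershell.exe invocation works as a build step triggered by commits to the database's source control repository, as described in the article.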

    Read the article

  • Google Talk Plugin in GMail on MacBook 2,1

    - by jrc03c
    I'd like to use the chat section in GMail to make phone calls. I've downloaded and installed the Google Talk plugin, and it acts like it knows what it's doing, but when I try to make calls, the internal laptop mic doesn't work at all (i.e., no one on the other end can hear me). In the GMail chat settings I've tried selecting "Default Device" for the microphone, as well as "Internal Audio Analog Stereo", but no matter which setting I try, nothing works. Note that this is only a problem in Ubuntu; calling works just fine in OS X and Windows (which means that yes, my Google Voice account is properly configured). Here are my tech specs:

        Ubuntu 10.10
        Kernel Linux 2.6.35-24-generic
        Gnome 2.32.0
        Google Chrome 8.0.552.237
        Google Talk Plugin (google-talkplugin) 1.8.0.0-1
        MacBook (2,1) with internal microphone

    Any help will be greatly appreciated! Thanks!

    Read the article

  • Ubuntu no longer prompts for root privilege (but doesn't give it either)

    - by Elad Avron
    So on 14.04 LTS I was playing around with some settings to solve another problem, and somehow managed to screw things up: Ubuntu no longer asks for root privileges before trying to perform administrative actions. The catch is that my user does NOT have them by default, which means those actions ALL fail. I can still run "sudo <command>" from the terminal, and it'll ask for my password and work fine, but any GUI that requires root just fails without asking anything. Any ideas what I did wrong and how to solve it? Thanks in advance.

    Read the article

< Previous Page | 258 259 260 261 262 263 264 265 266 267 268 269  | Next Page >