Search Results

Search found 5490 results on 220 pages for 'shadow folders'.

Page 95/220 | < Previous Page | 91 92 93 94 95 96 97 98 99 100 101 102  | Next Page >

  • Why is apt-cache so slow?

    - by Damn Terminal
    After upgrading to Trusty (14.04) from Saucy (13.10), all apt operations are very slow, even those that do not involve downloading anything or connecting to any servers. For example, displaying the apt policy # time apt-cache policy [...] real 0m8.951s user 0m5.069s sys 0m3.861s takes almost ten seconds! Most of it is a weird lag right after issuing the command, and it's the same even if I issue the same command again. On another system it doesn't even take a tenth of a second: real 0m0.096s user 0m0.070s sys 0m0.023s The other system is a little beefier, but there was no noticeable difference before the upgrade. It's the same with apt-get and anything else apt-related. How do I find the source of this lag and fix it? Additional info: # cat /etc/nsswitch.conf # /etc/nsswitch.conf # # Example configuration of GNU Name Service Switch functionality. # If you have the `glibc-doc-reference' and `info' packages installed, try: # `info libc "Name Service Switch"' for information about this file. passwd: compat group: compat shadow: compat hosts: files dns networks: files protocols: db files services: db files ethers: db files rpc: db files netgroup: nis BTW, is my understanding of how apt-cache works correct? It doesn't make any network connections when I run apt-cache policy, right? In case I'm wrong and it matters, here are my sources: https://gist.github.com/anonymous/02920270ff68e23fc3ec
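
    One general way to narrow this down (a diagnostic sketch, not a confirmed fix for this particular upgrade) is to let strace show where the time goes; apt-cache policy normally works only against the locally downloaded package lists, so no network connection should be involved:

        # Summarise which system calls the command spends its time in
        strace -c -f apt-cache policy > /dev/null

        # Or log every file open and socket connect, to see whether it stalls on
        # name-service lookups or on reading the package cache under /var/lib/apt
        strace -f -e trace=open,connect -o /tmp/apt-cache.trace apt-cache policy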

    Read the article

  • Areas of support needed when attempting to roll out a new software system

    In general, I think most people tend to be resistant to new systems, or to change in general, because they fear the unknown. Change means that their normal routine will be interrupted until the new routine becomes familiar enough to replace the old one. In addition, the fear of failure also generates resistance to change. Why would a worker want to move away from a process that has worked successfully for them in the past? Their fears overshadow any benefits a new system or business process will bring to their work life. Areas of support needed when attempting to roll out a new software system: Executive/Upper Management Support - If there is no support from the top of an organization, how will employees be supportive of the new system? Proper Training - Employees need to be trained on a new system prior to its rollout. The more training employees receive on a new system, the more comfortable they will be with it and the more accepting of the change, because they can see how it will benefit them. Employee Incentives - One way to reinforce the need for employees to use a new system is to offer incentives that ensure the system will be used. Employee Discipline/Termination - If employees adamantly refuse to use the new system after several warnings, they need to be formally reprimanded. If this does not work, the employer is forced to replace them.

    Read the article

  • Account not getting completely deleted within Linux

    - by lbanz
    I've got a NAS box running some flavour of Linux 2.6.31.8.nv+v2 with an ARM processor. It has a Samba share called 'all' that gives full read/write access to everyone. However, one Windows machine cannot access it without prompting for authentication, and I found out from the logs that the Windows account matches a local account on the NAS box. What I then went and did was delete the local account on the NAS. I can see that the account no longer exists in /home, /etc/passwd or /etc/shadow. However, the Samba logs show that Samba still thinks the account is there, as it says the account is disabled. I've tried rebooting both the NAS and the Windows box. Is there somewhere else that it stores account information? When I log on with a different account on that Windows machine I can access the share fine, and the smb logs show that it can't find the user and then allow anonymous access.
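
    One place worth checking (an assumption based on how Samba normally works, since the exact NAS firmware is unknown) is Samba's own account database, which is kept separately from /etc/passwd and /etc/shadow in an smbpasswd file or tdbsam backend:

        # List the accounts Samba itself still knows about
        pdbedit -L -v

        # Remove the stale entry (whichever tool the firmware ships)
        pdbedit -x -u username
        smbpasswd -x username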

    Read the article

  • Trouble with backup on SBS2008

    - by MemLeak
    We have an SBS2008 installation with a backup task that creates a backup twice a day onto external storage. This worked for a long time, but the backup can't be performed now. The last valid backup was made before some Exchange updates, SBS Update Rollup 8 and SharePoint updates were installed. The only error in the Event Logs is for VSS (the Volume Shadow Copy Service), and I only have it in German. Things I've tried: restarting the server, chkdsk /r /f on all volumes, restarting the VSS service, googling, and a manual backup. Nothing has helped. How can I get the backup working again?
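
    A sketch of the usual first diagnostic steps for this class of problem (general VSS troubleshooting, not specific to this post), run from an elevated command prompt on the server:

        rem Check whether any VSS writer (Exchange, SharePoint, System Writer, ...) is in a failed state
        vssadmin list writers

        rem Check that the backed-up volumes still have shadow copy storage available
        vssadmin list shadowstorage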

    Read the article

  • How do I improve terrain rendering batch counts using DirectX?

    - by gamer747
    We have determined that our terrain rendering system needs some work to minimize the number of batches being transferred to the GPU in order to improve performance. I'm looking for suggestions on how best to improve what we're trying to accomplish. We logically split our terrain mesh into smaller grid cells which are 32x32 world units. Each cell has metadata that dictates the four 256x256 textures that are used for splatting, along with the alpha blend data, shadow, and light mappings. Each cell contains 81 vertices in a 9x9 grid. Presently, we examine each cell and determine the four textures that are being used to splat the cell. We combine that geometry with any other cell that uses the same four textures, regardless of splat order. If the splat order for a cell differs, the blend map is adjusted so that the splat order matches the other like cells and blending still happens in the right order. But even with this batching approach, it isn't uncommon when looking out across an area of open terrain to see a batch count between 1200 and 1700, depending upon how frequently textures or texture blends differ between cells. We are only doing frustum culling presently. So, using texture splatting, are there other alternatives that can reduce the batch count and keep rendering extremely performance-friendly even under DirectX 9c? We considered using texture atlases, since we're targeting DirectX 9c and older OpenGL platforms, but trying to repeat textures using atlases and shaders results in seam artifacts which we haven't been able to eliminate except by disabling mipmapping, and disabling mipmapping results in poor texture quality at a distance. How have others batched together terrain geometry so that terrain can be splatted with various textures while minimizing batch count and texture state switches, so that rendering performance isn't negatively impacted?
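
    For illustration only (the types and field names below are placeholders, not the project's code), the grouping described above can be thought of as bucketing cells by an order-independent key built from their four splat textures, so that splat order never splits a batch:

        using System;
        using System.Collections.Generic;

        class TerrainCell { public int[] SplatTextureIds; /* the four splat texture ids */ }

        static class TerrainBatcher
        {
            // Group visible cells by their set of four splat textures, ignoring splat order,
            // so that cells sharing a texture set can share one draw call.
            public static Dictionary<string, List<TerrainCell>> BuildBatches(IEnumerable<TerrainCell> visibleCells)
            {
                var batches = new Dictionary<string, List<TerrainCell>>();
                foreach (TerrainCell cell in visibleCells)
                {
                    int[] ids = (int[])cell.SplatTextureIds.Clone();
                    Array.Sort(ids);                                   // order-independent key
                    string key = string.Join(",", Array.ConvertAll(ids, x => x.ToString()));

                    List<TerrainCell> batch;
                    if (!batches.TryGetValue(key, out batch))
                        batches[key] = batch = new List<TerrainCell>();
                    batch.Add(cell);                                   // one batch per key, not per cell
                }
                return batches;
            }
        }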

    Read the article

  • Realtime file-level mirroring from local NTFS to network drive

    - by hurfdurf
    We have some data collection machines running WinXP. After a new file is written, we would like to immediately copy the new file to network storage (a NetApp CIFS share) automagically. We need realtime or near realtime copies generated (copy upon filehandle close would be fine -- these are not long-running system logs). Two commercial applications I've found so far are MirrorFile and IBM's Tivoli CDP. Are there any reliable open source programs or simple ways to get Shadow Copy to do something similar? Bonus points if it runs as a service.
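
    Besides the commercial options, one low-tech sketch worth testing (paths are placeholders; on Windows XP robocopy comes from the Windows Server 2003 Resource Kit rather than being built in) is robocopy running in monitor mode, wrapped in a scheduled task or service wrapper:

        rem Copy new and changed files to the NetApp share without deleting anything there,
        rem re-running after every detected change, at most once per minute
        robocopy "D:\collected-data" "\\netapp\share\collected-data" /E /MON:1 /MOT:1 /R:3 /W:5 /NP /LOG+:C:\logs\mirror.log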

    Read the article

  • Creating disk snapshots in Windows 7

    - by Puneet Arora
    Does anyone know of a command or tool to create disk snapshots on Windows 7 (client SKU)? I see vssadmin.exe has a "create shadow" option, but that's available only on server SKUs: http://technet.microsoft.com/en-us/library/cc788055(v=ws.10).aspx I've a backup tool that replicates changes (creations, modifications and deletions to files and directories) since last backup, to my backup volume. Before each time this happens I want to create a persistent snapshot on my backup volume. I could then mount previous snapshots to view previous backups achieving a behavior similar to that of TimeMachine in OS X. This question has been asked before but unfortunately there weren't any good answers: Taking snapshots of filesystem/volume in Windows 7?
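
    On client SKUs the WMI Win32_ShadowCopy class is the usual substitute for vssadmin create shadow (a sketch from general Windows 7 usage, not from the linked question; E:\ stands in for the backup volume and needs an elevated prompt):

        rem Create a persistent ("ClientAccessible") snapshot of the backup volume
        wmic shadowcopy call create Volume=E:\

        rem List snapshots, then mount one read-only to browse an old backup
        rem (the HarddiskVolumeShadowCopyNN number comes from the list output; keep the trailing backslash)
        vssadmin list shadows
        mklink /D C:\snapshots\2013-10-21 \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy12\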

    Read the article

  • Accessing a storage-side snapshot of a cluster-shared volume

    - by syneticon-dj
    From time to time I am in the situation where I need to get data back from storage-side snapshots of cluster shared volumes. I suppose I just never figured out a way to do it right, so I always needed to: 1. expose the shadow copy as a separate LUN, 2. offline the original CSV in the cluster, 3. un-expose the LUN carrying the original CSV, 4. make sure my cluster nodes have detected the new LUN and no longer list the original one, 5. add the volume to the list of cluster volumes and promote it to be a CSV, 6. copy off the data I need, 7. undo steps 5-1 to revert to the original configuration. This is quite tedious and requires downtime for the original volume. Is there a better way to do this without involving a separate host outside of the cluster?

    Read the article

  • Rendering design. How can I effectively deal with forward, deferred and transparent rendering?

    - by user1423893
    I have many objects in my game world that all derive from one base class. Each object will have different materials and will therefore need to be drawn using various rendering techniques. I currently render my objects in the following order: deferred, forward, transparent (order-independent). Each object has a rendering flag that denotes which one of the above methods should be used. The list of base objects in the scene is then iterated through, and each object is added to a separate list of deferred, forward or transparent objects based on its rendering flag value. The individual lists are then iterated through and drawn in the order above, and each list is cleared at the end of the frame. This method works fairly well, but it requires a different draw method for each material type. For example, each object needs the following methods in order to be compatible with the possible flag settings: object.DrawDeferred(), object.DrawForward(), object.DrawTransparent(). It is also hard to see where methods outside of materials, such as rendering shadow maps (object.DrawShadow()), would fit using this "flag & method" design. I was hoping that someone may have some suggestions for improving this rendering process, possibly making it more generic and less verbose.
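
    One direction that is sometimes used for this (a sketch only, not the poster's design): replace the per-technique methods with a pass bitmask and a single Draw(pass) entry point, so that adding a pass such as shadow maps does not add a new method to every object:

        using System;
        using System.Collections.Generic;

        [Flags]
        enum RenderPass { ShadowMap = 1, Deferred = 2, Forward = 4, Transparent = 8 }

        abstract class SceneObject
        {
            // Which passes this object participates in, typically driven by its material.
            public RenderPass Passes { get; protected set; }
            public abstract void Draw(RenderPass pass);   // the object/material decides what each pass means
        }

        class Renderer
        {
            static readonly RenderPass[] Order =
                { RenderPass.ShadowMap, RenderPass.Deferred, RenderPass.Forward, RenderPass.Transparent };

            public void DrawFrame(IEnumerable<SceneObject> scene)
            {
                foreach (RenderPass pass in Order)
                    foreach (SceneObject obj in scene)
                        if ((obj.Passes & pass) != 0)
                            obj.Draw(pass);               // one entry point for every technique
            }
        }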

    Read the article

  • What is needed for 'Previous Versions' to be visible on the client OS?

    - by Zoredache
    I have servers with Shadow Copies enabled, taking snapshots a couple of times a day. From the server, if you look at the local devices you can see Previous Versions being populated reliably. But from remote clients, the ability for an end user to see Previous Versions seems to be very hit-or-miss. For the sake of this question you can assume that all my clients are Windows 7 and the servers are Windows Server 2008 R2. Is there an exhaustive list of everything that is required for an end user to see Previous Versions? Are there any requirements for a certain level of share or filesystem permissions, other than read access? Does something need to be open on the firewall, other than what is already in place for normal Windows networking?

    Read the article

  • No access to Samba shares

    - by koanhead
    I have three shared folders in my local home directory - that is to say, on my Ubuntu desktop's /home/me/. All were set up using "Sharing Options" in Nautilus' right-click menu. The standard "Music" and "Videos" folders are configured identically: the "Guest Access" box is checked, but the "Allow others to create and delete" is not. The third folder, called "shared", is configured to not allow Guest access but to allow others to modify files. I have not altered /etc/samba/smb.conf by hand, I have only used Sharing Options to create and modify these so-called "shares". My roommates have two Windows 7 computers and one Ubuntu Netbook Remix netbook. I have the aforementioned desktop machine and a laptop running 10.04. None of these machines can access any of the shares. Attempts to access the Guest shares result in the message \\machine\directory is not accessible. The network name could not be found. This is the error message generated by a VM running Windows 2000. The other Windows machines generate a similar error. The Ubuntu laptop gives the error Unable to mount location: Failed to mount Windows share. Hurrah, once again, for informative error messages. That really helps a lot. When attempting to browse the folder called "shared" from the laptop, I'm confronted with a password dialog. This behavior is the same with all machines I've tried in the situation. On entering my username and password for the account to which the shares belong, the password dialog briefly disappears and is replaced with an identical dialog. No error message, useful or not, appears. When attempting to browse this folder with the VM, the outcome is the same except that the password dialog helpfully states "incorrect username or password". My assumption is that the username and password in question are those of the user who owns the shares. I have tried all other username and password combinations available in this context and the outcome is the same. I would like to be able to share files. Sharing them with Windows machines is a nice feature, or would be if it was available. Really I consider sharing files between two machines with the same version of the same operating system kind of a minimum condition for network usability. Samba last functioned reliably for me more than ten years ago. I have attempted to use it on and off since then with only intermittent success. Oh, and "Personal File Sharing" from the Preferences menu does not result in an entry in Places → Network → my-server. In fact, the old entry "MY-SERVER" goes away and is replaced by "koanhead's public files on my-server", which when I attempt to open it from the laptop gives a "DBus.Error.NoReply: Message did not receive a reply." I know I come here and gripe about Ubuntu a lot, but on the other hand I spend literally hours every day trying to fix things in Ubuntu. It's a good system which aspires to greatness, which is why things like this either need to work, or be adequately documented. Ideally both would be the case. Anyway, rant over. Hopefully someone will have some insight on this issue. Thanks all who bother to read this wall o'text for your time.
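
    Since these shares were created from Nautilus, they are Samba usershares rather than entries in smb.conf, so a reasonable first diagnostic pass (general commands, not a guaranteed fix) is to check what Samba actually ends up serving:

        # Show the usershares Nautilus created, including their guest and ACL settings
        net usershare info --long

        # Check the effective smb.conf (usershare path, "usershare allow guests", workgroup, etc.)
        testparm -s

        # Ask the server itself which shares it offers, first anonymously, then as your own user
        smbclient -L localhost -N
        smbclient -L localhost -U me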

    Read the article

  • Moving cpanel backup of magento site to VPS

    - by user2564024
    I was hosting my site on shared hosting and took a full backup; its structure looks like this: addons homedir mysql resellerpackages suspendinfo bandwidth homedir_paths mysql.sql sds userconfig counters httpfiles mysql-timestamps sds2 userdata cp locale nobodyfiles shadow va cron logaholic pds shell vad digestshadow logs proftpdpasswd ssl version dnszones meta psql sslcerts vf domainkeys mm quota ssldomain fp mma resellerconfig sslkeys has_sslstorage mms resellerfeatures suspended Now I have subscribed to a VPS and copied the files inside homedir/public_html to /var/www/html on the new hosting, but I am seeing the following error when I view it in the browser: There has been an error processing your request Exception printing is disabled by default for security reasons. Error log record number: 259343920016 I have just created a database named magenhto inside MySQL. Previously I had cpanel and used a one-click installer, so I am not aware of how to get that data into MySQL on this new system, or whether there are any more changes needed.
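
    A rough outline of the usual steps for this kind of Magento move (a sketch with placeholder paths; in a cpanel full backup the dump is the mysql.sql file or a file under the mysql/ directory): import the dump, point Magento at the new credentials, and enable the error report page so the real error is shown instead of "Exception printing is disabled":

        # Import the old database into the one created on the VPS
        mysql -u root -p magenhto < mysql.sql

        # Point Magento at the new database (host/user/password inside <connection>)
        nano /var/www/html/app/etc/local.xml

        # Enable the detailed error report page that the current message is hiding
        cp /var/www/html/errors/local.xml.sample /var/www/html/errors/local.xml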

    Read the article

  • Backup all plesk MySQL Databases to individual files

    - by Michael
    Hi, because I'm new to shell scripting I need a hand. I currently back up all my databases to a single file, which makes restoring pretty hard. The second problem is that my MySQL password doesn't work because of a Plesk bug, so I get the password from "/etc/psa/.psa.shadow". Here is the code that I use to back up all my databases to a single file: mysqldump -uadmin -p`cat /etc/psa/.psa.shadow` --all-databases | bzip2 -c > /root/21.10.2013.sql.bz2 I found some scripts on the web that back up each database to an individual file, but I don't know how to make them work for my situation. Here is an example script: for db in $(mysql -e 'show databases' -s --skip-column-names); do mysqldump $db | gzip > "/backups/mysqldump-$(hostname)-$db-$(date +%Y-%m-%d-%H.%M.%S).gz"; done Can someone help me make the script above work for my situation? Requirements: back up each database to an individual file, using the Plesk password location.
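
    A sketch that combines the two (the backup directory is an assumption, and bzip2 is carried over from the single-file command; adjust paths as needed):

        #!/bin/sh
        # Dump every database to its own compressed file, using the Plesk admin password.
        PASS=$(cat /etc/psa/.psa.shadow)
        BACKUP_DIR=/root/backups
        DATE=$(date +%Y-%m-%d-%H.%M.%S)
        mkdir -p "$BACKUP_DIR"

        for db in $(mysql -uadmin -p"$PASS" -e 'show databases' -s --skip-column-names); do
            case "$db" in
                information_schema|performance_schema) continue ;;   # skip internal schemas
            esac
            mysqldump -uadmin -p"$PASS" "$db" | bzip2 -c > "$BACKUP_DIR/$DATE-$db.sql.bz2"
        done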

    Read the article

  • How best to automatically deal with multiple site_ruby locations?

    - by cclark
    Is there a way to automatically append to $: variable in ruby to account for additional site_ruby locations? Ruby is installed in /usr/local/ and using gem_install will properly install the new ruby files in to /usr/local/lib/ruby/site_ruby. However there are some RPMs for ruby bindings to tools like shadow which we'd like to install and they install to /usr/lib/ruby/site_ruby (no local). Is there a standard way to tell ruby that this directory should also be included by default? I know scripts could dynamically update $: or they could be called with -I but it seems like this is something that should be handled in the install. Has anyone else found a clean way around this kind of problem? thanks, chuck
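
    One install-level option (the exact site_ruby subdirectory below is an assumption; adjust it to the installed Ruby version) is to export RUBYLIB system-wide, since directories listed in RUBYLIB are prepended to $: for every ruby invocation without touching individual scripts:

        # /etc/profile.d/rubylib.sh -- picked up by login shells on most distros
        export RUBYLIB=/usr/lib/ruby/site_ruby/1.8:/usr/lib/ruby/site_ruby${RUBYLIB:+:$RUBYLIB}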

    Read the article

  • Blurred refurbished TFT

    - by PeterMmm
    I got 3 refurbished PCs with 19" TFTs. All of the TFTs show a very blurry image. I estimate the TFTs are 2-3 yrs old. The PCs are running Windows 7 Pro, and the resolution is set to the TFT native values. It is possible that all the TFTs are broken, but I have similar models that are up to 5 yrs old without any issue. I still think it could be a config issue, but is it possible on the hardware side for a TFT to break in a way that shows up as a very blurry image? Update: TFT HP 2035, graphics Intel Q35/GMA 3100, analog D-SUB connector, manuf. date Sep 2005. Configured resolution 1600x1200, DPI (PPP) 150%. Without ClearType it is worse. Desktop icon titles seem to be good and clear, but in Notepad, for example, there is a pinkish shadow to the right of the characters.

    Read the article

  • Multiple Zend application code organisation

    - by user966936
    For the past year I have been working on a series of applications all based on the Zend framework and centered on a complex business logic that all applications must have access to even if they don't use all (easier than having multiple library folders for each application as they are all linked together with a common center). Without going into much detail about what the project is specifically about, I am looking for some input (as I am working on the project alone) on how I have "grouped" my code. I have tried to split it all up in such a way that it removes dependencies as much as possible. I'm trying to keep it as decoupled as I logically can, so in 12 months time when my time is up anyone else coming in can have no problem extending on what I have produced. Example structure: applicationStorage\ (contains all applications and associated data) applicationStorage\Applications\ (contains the applications themselves) applicationStorage\Applications\external\ (application grouping folder) (contains all external customer access applications) applicationStorage\Applications\external\site\ (main external customer access application) applicationStorage\Applications\external\site\Modules\ applicationStorage\Applications\external\site\Config\ applicationStorage\Applications\external\site\Layouts\ applicationStorage\Applications\external\site\ZendExtended\ (contains extended Zend classes specific to this application example: ZendExtended_Controller_Action extends zend_controller_Action ) applicationStorage\Applications\external\mobile\ (mobile external customer access application different workflow limited capabilities compared to full site version) applicationStorage\Applications\internal\ (application grouping folder) (contains all internal company applications) applicationStorage\Applications\internal\site\ (main internal application) applicationStorage\Applications\internal\mobile\ (mobile access has different flow and limited abilities compared to main site version) applicationStorage\Tests\ (contains PHP unit tests) applicationStorage\Library\ applicationStorage\Library\Service\ (contains all business logic, services and servicelocator; these are completely decoupled from Zend framework and rely on models' interfaces) applicationStorage\Library\Zend\ (Zend framework) applicationStorage\Library\Models\ (doesn't know services but is linked to Zend framework for DB operations; contains model interfaces and model datamappers for all business objects; examples include Iorder/IorderMapper, Iworksheet/IWorksheetMapper, Icustomer/IcustomerMapper) (Note: the Modules, Config, Layouts and ZendExtended folders are duplicated in each application folder; but i have omitted them as they are not required for my purposes.) For the library this contains all "universal" code. The Zend framework is at the heart of all applications, but I wanted my business logic to be Zend-framework-independent. All model and mapper interfaces have no public references to Zend_Db but actually wrap around it in private. So my hope is that in the future I will be able to rewrite the mappers and dbtables (containing a Models_DbTable_Abstract that extends Zend_Db_Table_Abstract) in order to decouple my business logic from the Zend framework if I want to move my business logic (services) to a non-Zend framework environment (maybe some other PHP framework). 
Using a serviceLocator and registering the required services within the bootstrap of each application, I can use different versions of the same service depending on the request and which application is being accessed. Example: all external applications will have a service_auth_External implementing service_auth_Interface registered; the same goes for internal applications with Service_Auth_Internal implementing service_auth_Interface, retrieved via Service_Locator::getService('Auth'). I'm concerned I may be missing some possible problems with this. One I'm half-thinking about is a config.ini file for all externals, then a separate per-application config.ini overriding or adding to the global external config.ini. If anyone has any suggestions I would greatly appreciate them. I have used context switching for AJAX functions within the individual applications, but there is a big chance that both external and internal will get web services created for them. Again, these will be separated due to authorization and the different services available. \applicationstorage\Applications\internal\webservice \applicationstorage\Applications\external\webservice
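
    For illustration (the registerService call below is an assumed Service_Locator method, not quoted from the project), the per-application bootstrap registration described above might look like:

        <?php
        // Bootstrap of an *external* application: bind the interface to the external implementation.
        class Bootstrap extends Zend_Application_Bootstrap_Bootstrap
        {
            protected function _initServices()
            {
                // The internal applications' bootstraps would register Service_Auth_Internal here instead.
                Service_Locator::registerService('Auth', new Service_Auth_External());
            }
        }

        // Anywhere in the shared library, only service_auth_Interface is known:
        $auth = Service_Locator::getService('Auth');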

    Read the article

  • Pros and Cons of Facebook's React vs. Web Components (Polymer)

    - by CletusW
    What are the main benefits of Facebook's React over the upcoming Web Components spec and vice versa (or perhaps a more apples-to-apples comparison would be to Google's Polymer library)? According to this JSConf EU talk and the React homepage, the main benefits of React are: Decoupling and increased cohesion using a component model; Abstraction, Composition and Expressivity; Virtual DOM & Synthetic events (which basically means they completely re-implemented the DOM and its event system); Enables modern HTML5 event stuff on IE 8; Server-side rendering; Testability; Bindings to SVG, VML, and <canvas>. Almost everything mentioned is being integrated into browsers natively through Web Components except this virtual DOM concept (obviously). I can see how the virtual DOM and synthetic events can be beneficial today to support old browsers, but isn't throwing away a huge chunk of native browser code kind of like shooting yourself in the foot in the long term? As far as modern browsers are concerned, isn't that a lot of unnecessary overhead/reinventing of the wheel? Here are some things I think React is missing that Web Components will take care of. Correct me if I'm wrong. Native browser support (read "guaranteed to be faster"); write script in a scripting language, write styles in a styling language, write markup in a markup language; style encapsulation using Shadow DOM (React instead has this, which requires writing CSS in JavaScript - not pretty); two-way binding.

    Read the article

  • Synchronize Linux /etc/ directory

    - by entend
    I have a virtual machine with Linux (Ubuntu server) which is used as a prototype for other machines. Sometimes I make changes to the prototype system and want to apply those changes to some other machine. I know about Puppet, cfengine and FAI, but I want something easy, for example an rsync script that works over ssh when needed. The main goal is the /etc/ directory, but I don't want to synchronize certain private files, for example /etc/passwd, /etc/shadow and so on, and I don't know the full list. Are there any tips for this task? Maybe someone has such an rsync script.
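
    A minimal sketch of such a script (the hostname and exclude list are assumptions and the list is almost certainly incomplete; extend it for any host-specific files you find):

        #!/bin/sh
        # Push /etc from the prototype to a target machine over ssh.
        # Keep --dry-run until the file list looks sane, then remove it.
        rsync -av --dry-run \
            --exclude='passwd*' --exclude='shadow*' --exclude='gshadow*' --exclude='group*' \
            --exclude='hostname' --exclude='hosts' --exclude='fstab' \
            --exclude='ssh/ssh_host_*' --exclude='network/interfaces' \
            --exclude='udev/rules.d/70-persistent-net.rules' \
            /etc/ root@target-host:/etc/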

    Read the article

  • Remote Assist Windows 2008 R2 Session

    - by cbergman
    As mentioned in this KB article, it is not possible to shadow a session with multiple monitors enabled. It does mention that "Remote Assistance supports multiple monitors, and is the presently recommended solution if you need this functionality." My question is: is it possible (and if so, how) to Remote Assist a terminal services session? We have a few Windows Server 2008 R2 Terminal Servers running with about 10 or so users who use 2 monitors. It would be ideal to be able to remote into their sessions, but I have yet to find a viable solution. Thank you.
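
    The usual starting point for this (a sketch only; whether it lets you pick the right multi-monitor RDS session is something to verify, and Offer Remote Assistance must be allowed by policy and firewall on the target) is Offer Remote Assistance from an admin workstation:

        rem Offer unsolicited Remote Assistance towards the session host the user is logged on to
        msra.exe /offerra rdsh01.example.local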

    Read the article

  • How to pre-load all deployed assemblies for an AppDomain

    - by Andras Zoltan
    Given an App Domain, there are many different locations that Fusion (the .Net assembly loader) will probe for a given assembly. Obviously, we take this functionality for granted and, since the probing appears to be embedded within the .Net runtime (Assembly._nLoad internal method seems to be the entry-point when Reflect-Loading - and I assume that implicit loading is probably covered by the same underlying algorithm), as developers we don't seem to be able to gain access to those search paths. My problem is that I have a component that does a lot of dynamic type resolution, and which needs to be able to ensure that all user-deployed assemblies for a given AppDomain are pre-loaded before it starts its work. Yes, it slows down startup - but the benefits we get from this component totally outweight this. The basic loading algorithm I've already written is as follows. It deep-scans a set of folders for any .dll (.exes are being excluded at the moment), and uses Assembly.LoadFrom to load the dll if it's AssemblyName cannot be found in the set of assemblies already loaded into the AppDomain (this is implemented inefficiently, but it can be optimized later): void PreLoad(IEnumerable<string> paths) { foreach(path p in paths) { PreLoad(p); } } void PreLoad(string p) { //all try/catch blocks are elided for brevity string[] files = null; files = Directory.GetFiles(p, "*.dll", SearchOption.AllDirectories); AssemblyName a = null; foreach (var s in files) { a = AssemblyName.GetAssemblyName(s); if (!AppDomain.CurrentDomain.GetAssemblies().Any( assembly => AssemblyName.ReferenceMatchesDefinition( assembly.GetName(), a))) Assembly.LoadFrom(s); } } LoadFrom is used because I've found that using Load() can lead to duplicate assemblies being loaded by Fusion if, when it probes for it, it doesn't find one loaded from where it expects to find it. So, with this in place, all I now have to do is to get a list in precedence order (highest to lowest) of the search paths that Fusion is going to be using when it searches for an assembly. Then I can simply iterate through them. The GAC is irrelevant for this, and I'm not interested in any environment-driven fixed paths that Fusion might use - only those paths that can be gleaned from the AppDomain which contain assemblies expressly deployed for the app. My first iteration of this simply used AppDomain.BaseDirectory. This works for services, form apps and console apps. It doesn't work for an Asp.Net website, however, since there are at least two main locations - the AppDomain.DynamicDirectory (where Asp.Net places it's dynamically generated page classes and any assemblies that the Aspx page code references), and then the site's Bin folder - which can be discovered from the AppDomain.SetupInformation.PrivateBinPath property. So I now have working code for the most basic types of apps now (Sql Server-hosted AppDomains are another story since the filesystem is virtualised) - but I came across an interesting issue a couple of days ago where this code simply doesn't work: the nUnit test runner. This uses both Shadow Copying (so my algorithm would need to be discovering and loading them from the shadow-copy drop folder, not from the bin folder) and it sets up the PrivateBinPath as being relative to the base directory. And of course there are loads of other hosting scenarios that I probably haven't considered; but which must be valid because otherwise Fusion would choke on loading the assemblies. 
I want to stop feeling around and introducing hack upon hack to accommodate these new scenarios as they crop up - what I want is, given an AppDomain and its setup information, the ability to produce this list of Folders that I should scan in order to pick up all the DLLs that are going to be loaded; regardless of how the AppDomain is setup. If Fusion can see them as all the same, then so should my code. Of course, I might have to alter the algorithm if .Net changes its internals - that's just a cross I'll have to bear. Equally, I'm happy to consider SQL Server and any other similar environments as edge-cases that remain unsupported for now. Any ideas!?
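
    As a sketch of the direction being asked for (not a complete answer; it still ignores shadow-copy cache locations and hosts such as SQL Server), the probing folders that are knowable from the AppDomain itself can be gathered like this:

        using System;
        using System.Collections.Generic;
        using System.IO;

        static class ProbingPaths
        {
            // Folders Fusion will probe for this AppDomain, as far as the setup information exposes them.
            public static IEnumerable<string> Get(AppDomain domain)
            {
                AppDomainSetup setup = domain.SetupInformation;
                yield return setup.ApplicationBase;

                // ASP.NET's generated assemblies (Temporary ASP.NET Files) when hosted in IIS.
                if (!string.IsNullOrEmpty(domain.DynamicDirectory))
                    yield return domain.DynamicDirectory;

                // PrivateBinPath is a ';'-separated list and may be relative to ApplicationBase.
                if (!string.IsNullOrEmpty(setup.PrivateBinPath))
                {
                    foreach (string bin in setup.PrivateBinPath.Split(';'))
                    {
                        if (bin.Length == 0) continue;
                        yield return Path.IsPathRooted(bin)
                            ? bin
                            : Path.Combine(setup.ApplicationBase, bin);
                    }
                }
            }
        }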

    Read the article

  • jQuery pop up problems

    - by user327137
    Hi all, I am creating a site for a beauty salon from a template I purchased from TM. I want to create an online booking form with validation of name, number and service type, but I'm having trouble getting a link that will open a pop-up using jQuery, NOT plain HTML. How do I fix this? What code do I have to insert so that when you click "BOOK NOW" a jQuery pop-up appears in the centre of the page with a booking form on it? I have googled and googled, but it is all new to me as I'm a noob at jQuery. Here is a live demo of the template (template link for demo http://osc4.template-help.com/wt_31562/index.html#) and here is the code where I am trying to place the pop-up: <dt class="dt3"><a href="#"></a><img src="images/shadow.png" alt="" class="shadow"></dt> <dd id="page3"> <div class="inner"> <div class="content"> <section class="col-1"> <h2>our services</h2> <p>Vintage Beauty</p> <p class="dark">We offer Free Consultation for Botox, Fillers, Medical Skin Peels, Cosmetic Surgery <br> & also specialise n body and skin care. </p> <img src="images/page2-img1.png" alt="" class="p2"> <a href="#" class="more">view more</a> </section> <section class="col-2"> <h2>services</h2> <ul class="list p2"> <li><a href="#">Fish Pedicures</a></li> <li><a href="#">Manicures</a></li> <li><a href="#">Pedicures</a></li> <li><a href="#">Waxing</a></li> <li><a href="#">Threading</a></li> <li><a href="#">Tanning</a></li> <li><a href="#">Body Massage</a></li> <li><a href="#">Nail/Eye Extensions</a></li> <li><a href="#">Eye Lash/Brow Tinting</a></li> <li><a href="#">Twinkle Toes</a></li> <li><a href="#">Teeth Whitening Kits</a></li> <li><a href="#">Hot Wax Specialists</a></li> </ul> **<a href="#" class="more">BOOK ONLINE NOW</a> </section>**
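
    A minimal sketch of the jQuery side (IDs, styling and form fields are placeholders, and it assumes the template already loads jQuery): keep the booking form in a hidden div and have the BOOK ONLINE NOW link fade it in, centred, instead of navigating:

        <a href="#" id="book-now" class="more">BOOK ONLINE NOW</a>

        <div id="booking-popup" style="display:none; position:fixed; top:50%; left:50%;
             width:400px; margin:-160px 0 0 -200px; padding:20px; background:#fff; z-index:100;">
            <form id="booking-form">
                <input type="text" name="name" placeholder="Name">
                <input type="text" name="number" placeholder="Phone number">
                <input type="text" name="service" placeholder="Service type">
                <input type="submit" value="Book now">
            </form>
        </div>

        <script>
        $(function () {
            $('#book-now').click(function (e) {
                e.preventDefault();            // keep the link from jumping to "#"
                $('#booking-popup').fadeIn();  // show the centred pop-up
            });
        });
        </script>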

    Read the article

  • How to organize pictures on website using css? [on hold]

    - by user3624023
    Here is my website without any CSS: http://www.wmcicompsci.ca/cs20/students/theglowcloud/Bare%20bones%20website/classics_bare.html I am new to CSS and I would like to organize these pictures in this fashion: http://css-tricks.com/examples/SlideinCaptions/ I would just like that layout for the pictures; I do not need the sliding of the captions (I would like it, but it does not work in my browser). I would like the captions to be like titles on top of the pictures. Here is my current html code: <!DOCTYPE html> <html> <head> <title> My favourite Fantasy books</title> <meta charset = "utf-8"> <link rel="stylesheet" href="css.css"> </head> <body> <nav id="main_nav"> <ul> <li><a href = " homepage_css.html"> Homepage</a></li> <li><a href="science_fiction_css.html">Science Fiction</a></li> <li><a href="classics_css.html">Classics</a></li> <li><a href="fantasy_css.html">Fantasy</a></li> </ul> </nav> <h1> Fantasy Genre</h1> <p> Here are my favourites:</p> <ul> <li> Goblet of Fire by J.K Rowling (4th book in the Harry Potter Series) </li> <li><img class= displayed src="pics/fantasy/goblet_of_fire.jpg" width="200" alt="Goblet of Fire book cover"></li> <li> Graceling by Kristan Cashore </li> <li><img src="pics/fantasy/graceling.jpg" width="200" alt = " Graceling book cover"></li> <li> Serpent's Shadow by Rick Riordan (3rd book in the Kane Chronicles) </li> <li><img src="pics/fantasy/serpents_shadow.jpg" width="200" alt="Serpent's Shadow book cover"></li> <li> The Hobbit by J.R.R. Tolkein </li> <li><img src="pics/fantasy/the_hobbit.jpg" width="200" alt="The Hobbit book cover"></li> <li> The False Prince by Jennifer Neilson (1st book in the Ascendance Triology) </li> <li><img src="pics/fantasy/the_false_prince.jpg" width="200" alt="The False Prince book cover"></li> </ul>
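
    A sketch of the CSS side (class names are placeholders, and it assumes each book gets a single list item wrapping both the cover image and a caption span, rather than separate li elements): float the items side by side and absolutely position the caption as a title bar over the picture:

        ul.book-list { list-style: none; margin: 0; padding: 0; }

        ul.book-list li.book {
            position: relative;          /* anchor for the absolutely positioned caption */
            float: left;                 /* covers sit side by side and wrap */
            width: 200px;
            margin: 10px;
        }

        ul.book-list li.book img { display: block; width: 100%; }

        ul.book-list li.book span.caption {
            position: absolute;          /* overlay the title on top of the picture */
            top: 0; left: 0; right: 0;
            padding: 4px;
            background: rgba(0, 0, 0, 0.6);
            color: #fff;
            text-align: center;
        }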

    Read the article

  • Configuring xUnit test output in Hudson

    - by graham.reeds
    I have a simple PoC project in Hudson. The PoC has unit tests written via UnitTest++ and outputs the results as XML for consumption by xUnit to munge into jUnit format. Here are the relevant details: I have my project configured to use MSBuild to build the 2008 solution. The project contains both the dll it is to build and the unit tests, which are run as a post-build step. My workspace in Hudson is set to c:\develop\money (Money is the name of the project) and in the Hudson console I can see the workspace folders, the solution file and output folders (/bin, /doc, etc). The test console app outputs its file 'money_unit_tests.xml' to the folder 'reports' (making c:\develop\money\reports). I've restarted Hudson since installing xUnit and setting the workspace. However, when I start a Hudson build I am given the following message: [xUnit] Starting to record. [xUnit] [UnitTest] - Use the embedded style sheet. [xUnit] [ERROR] - No test report file(s) were found with the pattern 'reports/money_unit_tests.xml' relative to 'C:\.hudson\jobs\Money\workspace' for the testing framework 'UnitTest'. Did you enter a pattern relative to the correct directory? Did you generate the result report(s) for 'UnitTest'? [xUnit] Stopping recording. Finished: FAILURE Why does Hudson seem to think the workspace is in C:\.hudson... and not C:\Develop...? What can I do to change it? If I can't change it, what can I do to mitigate this? (I don't exactly want to hardcode the output for the xml to C:\.hudson...)

    Read the article

  • Ajax Minifier Visual Studio include all javascript files

    - by Michael
    I am using the Ajax Minifier http://www.ajaxprojects.com/ajax/tutorialdetails.php?itemid=766 and have embedded it in the csproj file for use in Visual Studio 2008 (not the free version). I have two folders, Content and Scripts, directly under the root of the project. Also, the Content folder has subfolders, and would like to include all of these as well (if I have to manually add each subfolder that is fine as well). Currently, my csproj file looks like this (and is included within the Project tags as instructed). There are no build errors, the files simply do not get minified. (I've enabled Project - View All files) <Import Project="$(MSBuildExtensionsPath)\Microsoft\MicrosoftAjax\ajaxmin.tasks" /> <Target Name="AfterBuild"> <ItemGroup> <JS Include="Scripts\*.js" Exclude="Scripts\*.min.js;"/> <JS Include="Content\**\*.js" Exclude="Content\**\*.min.js;"/> </ItemGroup> <AjaxMin SourceFiles="@(JS)" SourceExtensionPattern="\.js$" TargetExtension=".min.js" /> </Target> How would I edit the csproj file in order to include these folders?

    Read the article
