I'm having trouble getting an AWS box up and running with PuPHPet and Vagrant on Windows 7. The instance gets created and I can SSH into it (including via `vagrant ssh`), but something goes wrong when Vagrant attempts to set up a synced folder. I have tried disabling synced_folder, as per my config below, to no avail: I get the same error.
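For context, my understanding is that disabling the default share directly in a Vagrantfile would look roughly like this (illustration only; my Vagrantfile is generated by PuPHPet, so I have only been editing config.yaml):

Vagrant.configure("2") do |config|
  # Turn off the default /vagrant share entirely.
  config.vm.synced_folder ".", "/vagrant", disabled: true
end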
Output from Vagrant:
$ vagrant up
Bringing machine 'default' up with 'aws' provider...
==> default: Preparing SMB shared folders...
default: You will be asked for the username and password to use for the SMB
default: folders shortly. Please use the proper username/password of your
default: Windows account.
default:
default: Username: Will
default: Password (will be hidden):
==> default: Warning! The AWS provider doesn't support any of the Vagrant
==> default: high-level network configurations (`config.vm.network`). They
==> default: will be silently ignored.
==> default: Launching an instance with the following settings...
==> default: -- Type: t1.micro
==> default: -- AMI: ami-a850c898
==> default: -- Region: us-west-2
==> default: -- Keypair: blerp
==> default: -- Security Groups: ["blerp"]
==> default: -- Block Device Mapping: []
==> default: -- Terminate On Shutdown: false
==> default: -- Monitoring: false
==> default: -- EBS optimized: false
==> default: -- Assigning a public IP address in a VPC: false
==> default: Waiting for instance to become "ready"...
==> default: Waiting for SSH to become available...
==> default: Machine is booted and ready for use!
==> default: Mounting SMB shared folders...
We couldn't detect an IP address that was routable to this
machine from the guest machine! Please verify networking is properly
setup in the guest machine and that it is able to access this
host.
As another option, you can manually specify an IP for the machine
to mount from using the `smb_host` option to the synced folder.
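If it helps, I believe the `smb_host` option mentioned above is set in the Vagrantfile along these lines (a sketch only; the IP is a placeholder, and presumably the EC2 instance would need to be able to route back to my Windows machine for SMB to work at all, which may be the real problem):

Vagrant.configure("2") do |config|
  # Placeholder IP: must be an address the EC2 instance can actually reach.
  config.vm.synced_folder ".", "/vagrant",
    type: "smb", smb_host: "203.0.113.10"
end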
config.yaml:
---
vagrantfile-aws:
    vm:
        box: aws
        hostname: master
        network:
            private_network: 192.168.56.102
            forwarded_port: { }
        provider:
            aws:
                access_key_id: AKIAIXXXXXXXXXXXXXXEA
                secret_access_key: F/Jbzz8XXXXXXXXXXXXXXXXXXXXXXXXXXI
                keypair_name: blerp
                ami: ami-a850c898
                region: us-west-2
                instance_type: t1.micro
                security_groups:
                    - blerp
                tags:
                    Source: Puphpet
        provision:
            puppet:
                manifests_path: puphpet/puppet
                manifest_file: site.pp
                module_path: puphpet/puppet/modules
                options:
                    - '--verbose'
                    - '--hiera_config /vagrant/puphpet/puppet/hiera.yaml'
                    - '--parser future'
        synced_folder: { }
    ssh:
        host: null
        port: null
        private_key_path: C:\keys\blerp.pem
        username: admin
        guest_port: null
        keep_alive: true
        forward_agent: false
        forward_x11: false
        shell: 'bash -l'
    vagrant:
        host: detect
server:
    install: '1'
    packages:
        - htop
        - vim
users_groups:
    install: '1'
    groups:
        - blerp
    users:
        - vagrant
cron:
    install: '1'
    jobs: { }
firewall:
    install: '1'
    rules: null
apache:
    install: 0
    settings:
        user: www-data
        group: www-data
        default_vhost: true
        manage_user: false
        manage_group: false
        sendfile: 0
    modules:
        - rewrite
    vhosts:
        aigqp4eo8lau:
            servername: awesome.dev
            serveraliases:
                - www.awesome.dev
            docroot: /var/www/awesome
            port: '80'
            setenv:
                - 'APP_ENV dev'
            directories:
                nxinqq2xcvog:
                    provider: directory
                    path: /var/www/awesome
                    options:
                        - Indexes
                        - FollowSymlinks
                        - MultiViews
                    allow_override:
                        - All
                    require:
                        - all
                        - granted
                    custom_fragment: ''
            engine: php
            custom_fragment: ''
            ssl_cert: ''
            ssl_key: ''
            ssl_chain: ''
            ssl_certs_dir: ''
    mod_pagespeed: 0
nginx:
    install: '1'
    settings:
        default_vhost: 1
        proxy_buffer_size: 128k
        proxy_buffers: '4 256k'
    upstreams: { }
    vhosts:
        nzva8cncvz1v:
            proxy: ''
            server_name: pms.dev
            server_aliases:
                - www.awesome.dev
            www_root: /var/www/blerp
            listen_port: '80'
            location: \.php$
            index_files:
                - index.html
                - index.htm
                - index.php
            envvars:
                - 'APP_ENV dev'
            engine: php
            client_max_body_size: 1m
            ssl_cert: ''
            ssl_key: ''
php:
    install: '1'
    version: '56'
    composer: '1'
    composer_home: ''
    modules:
        php:
            - cli
            - intl
            - mcrypt
        pear: { }
        pecl:
            - pecl_http
    ini:
        display_errors: On
        error_reporting: '-1'
        session.save_path: /var/lib/php/session
    timezone: Europe/London
    mod_php: 0
hhvm:
    install: '0'
    nightly: 0
    composer: '1'
    composer_home: ''
    settings:
        host: 127.0.0.1
        port: '9000'
    ini:
        display_errors: On
        error_reporting: '-1'
    timezone: null
xdebug:
    install: '0'
    settings:
        xdebug.default_enable: '1'
        xdebug.remote_autostart: '0'
        xdebug.remote_connect_back: '1'
        xdebug.remote_enable: '1'
        xdebug.remote_handler: dbgp
        xdebug.remote_port: '9000'
xhprof:
    install: '0'
wpcli:
    install: '0'
    version: v0.17.1
drush:
    install: '0'
    version: 6.3.0
ruby:
    install: '1'
    versions:
        2zE2nPWS5zhS:
            version: ''
nodejs:
    install: '0'
    npm_packages: { }
python:
    install: '1'
    packages: { }
    versions:
        ZWnnHGyd3QEG:
            version: ''
mysql:
    install: '1'
    override_options: { }
    root_password: secret
    adminer: 0
    databases:
        DVuYzweWmPBc:
            grant:
                - ALL
            name: blerp
            host: localhost
            user: blerp
            password: secret
            sql_file: ''
postgresql:
    install: '0'
    settings:
        root_password: '123'
        user_group: postgres
        encoding: UTF8
        version: '9.3'
    databases: { }
    adminer: 0
mariadb:
    install: '0'
    override_options: { }
    root_password: '123'
    adminer: 0
    databases: { }
    version: '10.0'
sqlite:
    install: '0'
    adminer: 0
    databases: { }
mongodb:
    install: '0'
    settings:
        auth: 1
        port: '27017'
    databases: { }
redis:
    install: '0'
    settings:
        conf_port: '6379'
mailcatcher:
    install: '0'
    settings:
        smtp_ip: 0.0.0.0
        smtp_port: 1025
        http_ip: 0.0.0.0
        http_port: '1080'
        mailcatcher_path: /usr/local/rvm/wrappers/default
        from_email_method: inline
beanstalkd:
    install: '0'
    settings:
        listenaddress: 0.0.0.0
        listenport: '13000'
        maxjobsize: '65535'
        maxconnections: '1024'
        binlogdir: /var/lib/beanstalkd/binlog
        binlogfsync: null
        binlogsize: '10485760'
    beanstalk_console: 0
    binlogdir: /var/lib/beanstalkd/binlog
rabbitmq:
    install: '0'
    settings:
        port: '5672'
elastic_search:
    install: '0'
    settings:
        version: 1.4.1
        java_install: true
solr:
    install: '0'
    settings:
        version: 4.10.2
        port: '8984'
This sounds like a bug in either Vagrant or vagrant-aws.
Syncing folders to remote hosts (AWS, DigitalOcean, etc.) should use rsync, not SMB!
You're more likely to get an answer by raising this on those projects' GitHub repos.
Source: I created PuPHPet, and I do not believe this is a PuPHPet issue.
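As a workaround in the meantime, forcing the rsync synced-folder type in the Vagrantfile should avoid SMB entirely (a minimal sketch; the paths are Vagrant's defaults, and rsync itself must be installed on the Windows host, e.g. via Cygwin or MinGW):

Vagrant.configure("2") do |config|
  # Push files one-way to the instance over SSH using rsync instead of SMB.
  config.vm.synced_folder ".", "/vagrant", type: "rsync",
    rsync__exclude: [".git/", ".vagrant/"]
end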