  • From column: 用户2442861的专栏

    Caffe notes: Blobs, Layers, and Nets

    Note: the network structure is device-independent; Blob and Layer hide the implementation details of the model definition. Once the network is defined, you can switch between CPU and GPU modes via Caffe::mode() or Caffe::set_mode(). A Layer produces identical results in CPU and GPU modes (up to numerical error), so switching between the two is seamless and independent of the model definition.

    Published on 2018-09-19
  • From column: hml_知识记录

    Storing and using stream data (BLOBs and CLOBs)

    InterSystems SQL supports storing stream data as BLOBs (binary large objects) or CLOBs (character large objects) in an InterSystems IRIS Data Platform database. BLOBs are used to store binary information such as images, while CLOBs store character information. Both can hold up to 4 gigabytes of data (a limit imposed by the JDBC and ODBC specifications).

    Edited on 2022-06-07
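    The same BLOB round-trip idea works in any SQL engine; as a minimal, hedged sketch (using Python's standard sqlite3 module rather than InterSystems IRIS, and an invented table name), storing and reading back binary data looks like:

    ```python
    import sqlite3

    # In-memory database; sqlite3 maps Python bytes to the SQL BLOB type.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE images (name TEXT, data BLOB)")

    payload = bytes([0x89, 0x50, 0x4E, 0x47])  # first bytes of a PNG header
    conn.execute("INSERT INTO images VALUES (?, ?)", ("lena.png", payload))

    (restored,) = conn.execute(
        "SELECT data FROM images WHERE name = ?", ("lena.png",)
    ).fetchone()
    assert restored == payload  # the bytes round-trip unchanged
    conn.close()
    ```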
  • From column: AIUAI

    Caffe2 - (3) Blobs, Workspace, Tensors and related concepts

    Caffe2's data is organized as blobs. A blob is simply a named chunk of data in memory. A blob usually contains a tensor (think of it as a multidimensional array), which on the Python side appears as a numpy array. The Workspace stores all blobs. The example below shows how to feed blobs into the workspace (Feed) and fetch them back out (Fetch). A net is a graph of operators; each operator takes a set of input blobs and emits one or more output blobs. The first dim of the data and label blobs is 16, i.e. batchsize=16.

    Published on 2019-02-18
  • From column: python前行者

    sklearn's built-in dataset generators (make_blobs)

    Generating datasets: sklearn ships generators for classification, regression, clustering, manifold-learning and factorization tasks. The classification/clustering generators produce a sample feature matrix together with the matching class-label array. make_blobs produces multi-class datasets with good control over each class's center and standard deviation. Example parameters: center=[[1,1],[-1,-1],[1,-1]] cluster_std=0.3 X,labels=make_blobs(n_samples=200,centers=center,n_features Signature: sklearn.datasets.samples_generator.make_blobs(n_samples centers = [(-3, -3),(3, 3)] cluster_std = [0.5,0.7] X,y = make_blobs(n_samples=1000, centers=centers Further: centers = [(-3, -3),(0,0),(3, 3)] cluster_std = [0.5,0.7,0.5] X,y = make_blobs(n_samples=2000, centers=centers

    Published on 2019-03-25
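    A minimal runnable sketch of the parameters quoted in the snippet above (the centers and cluster_std values come from the snippet; random_state is added here only for reproducibility):

    ```python
    from sklearn.datasets import make_blobs

    # Three clusters with the centers and spread quoted in the snippet.
    center = [[1, 1], [-1, -1], [1, -1]]
    X, labels = make_blobs(
        n_samples=200, centers=center, cluster_std=0.3, random_state=0
    )

    print(X.shape)                       # (200, 2): 200 samples, 2 features
    print(sorted(set(labels.tolist())))  # [0, 1, 2]: one label per cluster
    ```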
  • From column: AIUAI

    Caffe2 - (32) Detectron's roi_data - model minibatch blobs

    The matching roi_data module builds the minibatch blobs for each model. dict with Mask R-CNN blobs blobs['mask_rois'] = rois_fg blobs['roi_has_mask_int32'] = roi_has_mask rpn_blobs.items(): blobs[k].append(v) for k, v in blobs.items(): if isinstance [0] if len(blobs_out) == 1 else blobs_out 5. retinanet.py """ Compute the minibatch blobs for training the RetinaNet network. """ blobs['retnet_fg_num'] = blobs['retnet_fg_num'].astype(np.float32) blobs['retnet_bg_num'] = blobs

    Published on 2018-05-17
  • From column: AIUAI

    Caffe2 - (27) Detectron's modeling - detector

    return blobs_out def GenerateProposalLabels(self, blobs_in): """ Python Op - generates output blobs - blobs_out (a variable set of blobs): returns the blobs the model needs for training. ([str(b) for b in blobs_in]) # the blobs list is unknown before running, since it is determined by the specific model being trained )(blobs_in, blobs_out, name=name) return blobs_out def CollectAndDistributeFpnRpnProposals :' + ','.join([str(b) for b in blobs_in]) # prepare the output blobs blobs_out = roi_data.fast_rcnn.get_fast_rcnn_blob_names

    Published on 2018-05-17
  • From column: Python编程 pyqt matplotlib

    Finding convex boundary points in a 2D point set

    ): edge_points = List() # tolerance = 50 # 0 gives the strict boundary of the point set for i in range(len(blobs)-1): x1, y1 = blobs[i] # p1 for j in range(i+1, len(blobs)): # skip duplicates x2, y2 = = List(blobs) #import random #for i in range(20): #blobs.append((random.random()*10.0 , random.random()*8.0)) #blobs = List(blobs) edge_points = calc_boundary(blobs, tolerance=0.0 ) X = List([x for x, y in blobs]) Y = List([y for x, y in blobs]) X_ = [x for x, y in edge_points

    Edited on 2022-11-18
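    The snippet's pairwise test is truncated, so as a hedged, self-contained alternative, Andrew's monotone-chain algorithm finds the same convex boundary points in O(n log n). (calc_boundary and its tolerance parameter belong to the original author; this sketch is a standard convex hull instead.)

    ```python
    def convex_hull(points):
        """Andrew's monotone chain: returns hull vertices in CCW order."""
        pts = sorted(set(points))
        if len(pts) <= 2:
            return pts

        def cross(o, a, b):  # z-component of (a-o) x (b-o)
            return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

        lower, upper = [], []
        for p in pts:                      # build lower hull
            while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
                lower.pop()
            lower.append(p)
        for p in reversed(pts):            # build upper hull
            while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
                upper.pop()
            upper.append(p)
        return lower[:-1] + upper[:-1]     # drop duplicated endpoints

    blobs = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1)]  # (2, 1) is interior
    print(convex_hull(blobs))  # [(0, 0), (4, 0), (4, 3), (0, 3)]
    ```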
  • From column: AIUAI

    Caffe2 - (33) Detectron's roi_data - data loader

    Puts minibatch blobs into the GPU blobs queue. from __future__ import absolute_import from __future__ import division _blobs_queue_name = 'roi_blobs_queue_{}'.format(self. ordered_blobs[key] = blobs[key] coordinated_put(self.coordinator, self. _roidb[i] for i in db_inds] blobs, valid = get_minibatch(minibatch_db) return blobs _blobs_queue_capacity ) ) return self.create_enqueue_blobs() def close_blobs_queues(self

    Published on 2018-05-17
  • From column: AIUAI

    Caffe2 - (7) Converting a Caffemodel to a Caffe2 pb model

    if (pretrained_blobs[0].ndim == 4 and list(pretrained_blobs[0].shape[:2]) ! [0].shape)) weight = utils.NumpyArrayToCaffe2Tensor( pretrained_blobs[0].reshape(-1, pretrained_blobs [0].shape[0] if pretrained_blobs[2][0] != 0: mean = utils.NumpyArrayToCaffe2Tensor( (1. / pretrained_blobs[2][0]) * pretrained_blobs pretrained_blobs[2][0] = 1 pretrained_blobs[2] = np.tile(pretrained_blobs[2], (n_channels, ))

    Published on 2019-02-18
  • From column: 图像处理与模式识别研究所

    DoG blob detection

    blob_doh im=cv2.imread('C:/Users/xpp/Desktop/Lena.png') # original image im_gray=rgb2gray(im) # convert the color image to grayscale dog_blobs =blob_dog(im_gray,max_sigma=30,threshold=0.1) # DoG blob detection dog_blobs[:,2]=sqrt(2)*dog_blobs[:,2] blobs_list =[dog_blobs] colors=['lime'] titles=['Difference of Gaussian'] sequence=zip(blobs_list,colors,titles) enumerate(sequence): axes[idx+1].imshow(im,interpolation='nearest') axes[idx+1].set_title('Blobs with '+title,size=30) for blob in blobs: y,x,row=blob col=pylab.Circle((x,y),row,

    Edited on 2022-05-28
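    The core of DoG detection is just the difference of two Gaussian-smoothed copies of the image; a minimal numpy/scipy sketch of that idea (the synthetic disk image and both sigma values are illustrative assumptions, not from the article, which uses skimage's blob_dog):

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Synthetic image: one bright disk (a "blob") on a dark background.
    im = np.zeros((64, 64))
    yy, xx = np.mgrid[:64, :64]
    im[(yy - 32) ** 2 + (xx - 32) ** 2 <= 5 ** 2] = 1.0

    # Difference of Gaussians: fine smoothing minus coarse smoothing
    # responds most strongly at blob centers of a matching scale.
    dog = gaussian_filter(im, sigma=2) - gaussian_filter(im, sigma=4)

    peak = np.unravel_index(np.argmax(dog), dog.shape)
    print(peak)  # close to the disk center (32, 32)
    ```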
  • From column: 翻译scikit-learn Cookbook

    Using KMeans to cluster data

    from sklearn.datasets import make_blobs blobs, classes = make_blobs(500, centers=3) Also, since we'll Then we'll talk a little bit about how KMeans works to find the optimal number of blobs. Looking at our blobs, we can see that there are three distinct clusters: We'll work through a simple example, clustering dummy data into point sets; we can see three distinct groups here: f, ax = plt.subplots(figsize=(7.5, 7.5)) ax.scatter(blobs[:, 0], blobs[:, 1], color=rgb _[:, 1], marker='*', s=250, color='black', label='Centers') ax.set_title("Blobs") ax.legend(loc='best

    Published on 2019-11-20
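    A hedged, runnable condensation of the recipe above (random_state and the explicit n_init are added here for reproducibility; the book's version relies on defaults):

    ```python
    from sklearn.datasets import make_blobs
    from sklearn.cluster import KMeans

    # 500 points drawn around three cluster centers, as in the recipe.
    blobs, classes = make_blobs(500, centers=3, random_state=0)

    kmean = KMeans(n_clusters=3, n_init=10, random_state=0).fit(blobs)

    print(kmean.cluster_centers_.shape)       # (3, 2): one 2-D center per cluster
    print(len(set(kmean.labels_.tolist())))   # 3
    ```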
  • From column: Kubernetes

    Registry GC source-code analysis

    dry-run GC demo: a dry run prints the blobs that would be deleted without actually deleting them. Blobs that are "marked" are kept; blobs that are "eligible" are deleted. In the example above, you can see 4 blobs marked and 5 blobs eligible for deletion. When designing a GC scheme in production, we can decide whether to trigger GC based on the number of eligible blobs; for example, trigger Registry GC when more than 500 blobs are eligible for deletion.

    Published on 2018-04-13
  • From column: 深度学习和计算机视觉

    Image processing: blob detection and connected components

    blob_doh from math import sqrt import matplotlib.pyplot as plt import numpy as np Code for the three blob-detection methods: blobs_log = blob_log(im_bw, max_sigma=30, num_sigma=10, threshold=.1) # Compute radii in the 3rd column. blobs_log [:, 2] = blobs_log[:, 2] * sqrt(2) # normalizing and scaling parameter so that it matches the blobs_dog = blob_dog(im_bw, max_sigma=30, threshold=.1) blobs_dog[:, 2] = blobs_dog[:, 2] * sqrt(2) blobs_doh = blob_doh(im_bw, max_sigma=30, threshold=.01) blobs_list = [blobs_log, blobs_dog, blobs_doh] colors

    Published on 2021-03-12
  • From column: 图像处理与模式识别研究所

    LoG blob detection

    =blob_log(im_gray,max_sigma=30,num_sigma=10,threshold=.1) # LoG blob detection blobs_log[:,2]=sqrt(2)*blobs_log[:, 2] blobs_list=[blobs_log] colors=['yellow'] titles=['Laplacian of Gaussian'] sequence=zip(blobs_list, interpolation='nearest') axes[0].set_title('original image',size=30),axes[0].set_axis_off() for idx, (blobs enumerate(sequence): axes[idx+1].imshow(im,interpolation='nearest') axes[idx+1].set_title('Blobs with '+title,size=30) for blob in blobs: y,x,row=blob col=pylab.Circle((x,y),row,

    Edited on 2022-05-28
  • From column: AIUAI

    Caffe2 - (25) Detectron's utils functions (3)

    = workspace.Blobs() with open(weights_file, 'r') as f: src_blobs = pickle.load(f) if be only blobs, now they are # stored under the 'blobs' key src_blobs = src_blobs['blobs # load these blobs into the '__preserve__/' namescope in CPU memory. # These blobs are also saved when the model is written to the weights file. blobs_per_gpu = int(len(all_blobs) / cfg.NUM_GPUS) for i in range(blobs_per_gpu): blobs = [p for p in all_blobs[i::blobs_per_gpu]] data = workspace.FetchBlob(blobs[0])

    Published on 2019-02-18
  • From column: 图像处理与模式识别研究所

    DoH blob detection

    =blob_doh(im_gray,max_sigma=30,threshold=0.005) # DoH blob detection blobs_doh[:, 2]=sqrt(2)*blobs_doh[:,2] blobs_list =[blobs_doh] colors=['red'] titles=['Determinant of Hessian'] sequence=zip(blobs_list,colors,titles) interpolation='nearest') axes[0].set_title('original image',size=30),axes[0].set_axis_off() for idx, (blobs enumerate(sequence): axes[idx+1].imshow(im,interpolation='nearest') axes[idx+1].set_title('Blobs with '+title,size=30) for blob in blobs: y,x,row=blob col=pylab.Circle((x,y),row,

    Edited on 2022-05-28
  • From column: 本立2道生

    Understanding the Caffe source 3: the Layer base class and the template-method design pattern

    phase_ = param.phase(); if (layer_param_.blobs_size() > 0) { blobs_.resize(layer_param_.blobs_size ()); for (int i = 0; i < layer_param_.blobs_size(); ++i) { blobs_[i].reset(new Blob<Dtype >()); blobs_[i]->FromProto(layer_param_.blobs(i)); } } } virtual ~Layer() {} SetUp (); for (int i = 0; i < blobs_.size(); ++i) { blobs_[i]->ToProto(param->add_blobs(), write_diff References: Blobs, Layers, and Nets: anatomy of a Caffe model; virtual functions and polymorphism

    Published on 2018-12-27
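    The template-method idea named in the title, stripped of all Caffe specifics, can be sketched in a few lines (the class and method names here are illustrative, not Caffe's; in Caffe the non-virtual SetUp plays the template role, and LayerSetUp/Reshape are the overridable steps):

    ```python
    class Layer:
        """Template method: setup() fixes the call order; subclasses
        override only the individual steps, never setup() itself."""
        def setup(self, bottom):
            self.check_blob_counts(bottom)
            self.layer_setup(bottom)   # subclass hook
            self.reshape(bottom)       # subclass hook

        def check_blob_counts(self, bottom):
            assert len(bottom) > 0, "layer needs at least one input blob"

        def layer_setup(self, bottom):
            pass  # optional hook

        def reshape(self, bottom):
            raise NotImplementedError

    class InnerProductLayer(Layer):
        def layer_setup(self, bottom):
            self.num_output = 10

        def reshape(self, bottom):
            self.top_shape = (len(bottom[0]), self.num_output)

    layer = InnerProductLayer()
    layer.setup([[1.0, 2.0, 3.0]])  # one input blob with three values
    print(layer.top_shape)  # (3, 10)
    ```

    The base class owns the invariant sequence (checks, then setup, then reshape), so every concrete layer gets the same lifecycle for free.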
  • From column: AIUAI

    Caffe2 - (26) Detectron's custom Python operators (ops)

    The layer's inputs and outputs - Input Blobs & Output Blobs: - Input Blobs: [rpn_rois_fpn<min>, ..., rpn_rois_fpn During training, the input blobs also include [roidb, im_info]. - Output blobs - blobs_out (a variable set of blobs): returns the blobs the model needs for training. = {k: [] for k in output_blob_names} roi_data.fast_rcnn.add_fast_rcnn_blobs(blobs, im_scales Layer input and output blobs - blobs_in and blobs_out - input blobs_in: - rpn_cls_probs: 4D tensor, shape (N

    Published on 2018-05-17
  • From column: 翻译scikit-learn Cookbook

    Optimizing the number of centroids

    How to do it… To get started we'll create several blobs that can be used to simulate clusters of data: from sklearn.datasets import make_blobs import numpy as np blobs, classes = make_blobs(500, centers=3) from sklearn.cluster import KMeans kmean = KMeans(n_clusters=3) kmean.fit (blobs) KMeans(algorithm='auto', copy_x=True, init='k-means++', max_iter=300, n_clusters=3, n_init # first new ground truth >>> blobs, classes = make_blobs(500, centers=10) >>> sillhouette_avgs

    Published on 2019-11-21
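    The silhouette averages the snippet alludes to are the usual way to pick the number of centroids; a hedged sketch of that idea (the three well-separated centers and random_state are chosen here so the answer is unambiguous; they are not the book's values):

    ```python
    from sklearn.datasets import make_blobs
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score

    # Three well-separated clusters, so k=3 should score best.
    blobs, _ = make_blobs(
        500, centers=[(-6, -6), (0, 0), (6, 6)], cluster_std=0.5, random_state=0
    )

    # Silhouette average for each candidate k: higher means tighter,
    # better-separated clusters.
    scores = {}
    for k in range(2, 6):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(blobs)
        scores[k] = silhouette_score(blobs, labels)

    best_k = max(scores, key=scores.get)
    print(best_k)  # 3
    ```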
  • From column: 杨建荣的学习笔记

    Field-length issues to watch out for in MySQL

    The maximum row size for the used table type, not counting BLOBs, is 65535.

    Published on 2018-03-21
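    The 65535-byte row limit quoted above is counted in bytes, not characters, so multi-byte charsets hit it faster than expected; a small hedged arithmetic sketch (the column width is invented for illustration):

    ```python
    # MySQL's row-size limit (not counting BLOB/TEXT columns, which keep
    # only a pointer in the row): 65535 bytes.
    ROW_LIMIT = 65535

    # utf8mb4 reserves up to 4 bytes per character, so a VARCHAR(20000)
    # column alone needs up to 80000 bytes and cannot fit in one row.
    utf8mb4_bytes = 4 * 20000
    print(utf8mb4_bytes > ROW_LIMIT)  # True: this column exceeds the limit

    # The same column declared latin1 (1 byte per character) fits easily.
    latin1_bytes = 1 * 20000
    print(latin1_bytes <= ROW_LIMIT)  # True
    ```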